Subaru HDS Reduced Data Archive
Using the data for publication
All papers which make use of data taken with Subaru Telescope facilities
should include the following acknowledgment on the title page as a
footnote to the title.
"This research is based [in part] on data collected at Subaru Telescope,
which is operated by the National Astronomical Observatory of Japan.
We are honored and grateful for the opportunity of observing the
Universe from Maunakea, which has the cultural, historical and natural
significance in Hawaii."
More information on publication can be found on the Subaru web site:
Publishing Results from Subaru.
Please also include the following sentence on the title page as a
footnote to the title or in the acknowledgments of the paper.
"[Part of] the data are retrieved from the JVO portal (http://jvo.nao.ac.jp/portal)
operated by the NAOJ"
HDS metadata
The metadata of all the HDS FITS files registered in the JVO system is available
from the link below. You can select data from the metadata list according to
your interest and download all of them in an automated way using wget or
other download utilities.
An example of downloading all the FITS files (spectra with continuum) processed by the JVO HDS pipeline
on Linux:
# Download hds-meta-PIPE-1.0_240505.psv (from the "Download" tab).
# Extract the URLs (FITS data with continuum) from the file.
grep -v "#" hds-meta-PIPE-1.0_240505.psv | awk -F"|" '{print $27}' > url.dat
# Download all the data.
wget --content-disposition -i url.dat
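If you do not need the full data set, the same metadata file can be filtered before extracting the URLs. The sketch below is illustrative only: the URL column ($27) is the one documented above, but the object-name column used for filtering (field $4 here) is a hypothetical placeholder; check the "#"-prefixed header lines of the actual .psv file for the real column number.

```shell
# filter_urls METAFILE TARGET
# Print the URL column (field 27) for rows whose object-name column
# matches TARGET. NOTE: field 4 for the object name is an assumption;
# verify the column layout against the header of the real .psv file.
filter_urls() {
  meta=$1
  target=$2
  grep -v "#" "$meta" | awk -F"|" -v t="$target" '$4 ~ t {print $27}'
}

# Usage on the real metadata file, then download as before:
#   filter_urls hds-meta-PIPE-1.0_240505.psv "HD 122563" > url.dat
#   wget --content-disposition -i url.dat
```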
Download:
Change Log
- 2024-10-16: HDS spectra which were processed by JVO pipeline can now be visualized by FITS WebQL.
- 2024-06-28: Data processed by the JVO HDS pipeline are now available.
The number of newly processed spectra is 41,968.
- 2024-06-28: The GUI was completely revised.
HDS Reduced Data Archive
- The purpose of this archive is to provide reduced data
taken with Subaru/HDS for quick inspection of the spectra.
- These data were processed either through an automated data reduction pipeline
developed by JVO or in a semi-automated fashion by Satoshi Kawanomoto
and Miho Ishigaki.
- A summary of the reduction procedure written by M. Ishigaki can be obtained
here (Japanese only).
- The JVO automated pipeline follows the
HDS Quick Reduction procedure.
- Only data processed with sufficiently good quality are released in this archive,
but please check the quality of the processed data yourself:
- Verify that aperture extraction is appropriately performed.
- Check the result of the wavelength calibration by the ThAr lamp.
- Check that continuum normalization is appropriately performed, if you use the normalized spectrum.
- Metadata of all the processed data and URLs to retrieve the data are available
from the "Download" tab.
- The JVO pipeline makes use of
IRAF,
PyRAF,
hds_iraf,
the NOIRLab ThAr Spectral Atlas,
FTOOLS,
and ROOT.
We would like to thank the developers of these software packages for making them available to the public.
- Should you have any questions, please feel free to contact us at:
.