Set to True if the table contains unsigned integer columns.
ver int > 0 or None , optional
The version number of the HDU; this will be the value of the keyword EXTVER . If not given or None, it defaults to the value of the EXTVER card of the header, or 1. (default: None)
character_as_bytes bool
Whether to return bytes for string columns. By default this is False and (unicode) strings are returned, but this does not respect memory mapping and loads the whole column in memory when accessed.
add_checksum ( when = None , override_datasum = False , checksum_keyword = 'CHECKSUM' , datasum_keyword = 'DATASUM' ) #
Add the CHECKSUM and DATASUM cards to this HDU with the values set to the checksum calculated for the HDU and the data respectively. The addition of the DATASUM card may be overridden.
Parameters : when str , optional
Comment string for the cards; by default the comments will represent the time when the checksum was calculated.
override_datasum bool , optional
If True , add only the CHECKSUM card; the DATASUM card is not recomputed.
checksum_keyword str , optional
The name of the header keyword to store the checksum value in; this is typically ‘CHECKSUM’ per convention, but there exist use cases in which a different keyword should be used
datasum_keyword str , optional
The name of the header keyword to store the datasum value in; this is typically ‘DATASUM’ per convention, but there exist use cases in which a different keyword should be used
For testing purposes, first call add_datasum with a when argument, then call add_checksum with a when argument and override_datasum set to True . This will provide consistent comments for both cards and enable the generation of a CHECKSUM card with a consistent value.
add_datasum ( when = None , datasum_keyword = 'DATASUM' ) #
Add the DATASUM card to this HDU with the value set to the checksum calculated for the data.
Parameters : when str , optional
Comment string for the card that by default represents the time when the checksum was calculated
datasum_keyword str , optional
The name of the header keyword to store the datasum value in; this is typically ‘DATASUM’ per convention, but there exist use cases in which a different keyword should be used
Returns : checksum int
The calculated datasum
For testing purposes, provide a when argument to enable the comment value in the card to remain consistent. This will enable the generation of a CHECKSUM card with a consistent value.
The ColDefs object describing the columns in this table.
Make a copy of the table HDU, both header and data are copied.
dump ( datafile = None , cdfile = None , hfile = None , overwrite = False ) [source] #
Dump the table HDU to a file in ASCII format. The table may be dumped in three separate files, one containing column definitions, one containing header parameters, and one for table data.
Parameters : datafile path-like object or file-like object , optional
Output data file. The default is the root name of the fits file associated with this HDU appended with the extension .txt .
cdfile path-like object or file-like object , optional
Output column definitions file. The default is None , no column definitions output is produced.
hfile path-like object or file-like object , optional
Output header parameters file. The default is None , no header parameters output is produced.
overwrite bool , optional
If True , overwrite the output file if it exists. Raises an OSError if False and the output file exists. Default is False .
The primary use for the dump method is to allow viewing and editing the table data and parameters in a standard text editor. The load method can be used to create a new table from the three plain text (ASCII) files.
Note This format does not support variable-length arrays using the ‘Q’ format, due to difficult-to-overcome ambiguities. What this means is that this file format cannot support VLA columns in tables stored in files that are over 2 GB in size.
Calculates and returns the number of bytes that this HDU will write to a file.
Returns a dictionary detailing information about the locations of this HDU within any associated file. The values are only valid after a read or write of the associated file with no intervening changes to the HDUList .
The dictionary details information about the locations of this HDU within an associated file. Returns None when the HDU is not associated with a file.
file : File object associated with the HDU
filemode : Mode in which the file was opened (readonly, copyonwrite, update, append, ostream)
hdrLoc : Starting byte location of header in file
datLoc : Starting byte location of data block in file
datSpan : Data size including padding
classmethod from_columns ( columns , header = None , nrows = 0 , fill = False , character_as_bytes = False , ** kwargs ) #
Given either a ColDefs object, a sequence of Column objects, or another table HDU or table data (a FITS_rec or multi-field numpy.ndarray or numpy.recarray object), return a new table HDU of the class this method was called on, using the column definitions from the input.
Parameters : columns sequence of Column or ColDefs -like
The columns from which to create the table data, or an object with a column-like structure from which a ColDefs can be instantiated. This includes an existing BinTableHDU or TableHDU , or a numpy.recarray , to give some examples.
If these columns have data arrays attached that data may be used in initializing the new table. Otherwise the input columns will be used as a template for a new table with the requested number of rows.
header Header
An optional Header object to instantiate the new HDU with. Header keywords specifically related to defining the table structure (such as the “TXXXn” keywords like TTYPEn) will be overridden by the supplied column definitions, but all other informational and data model-specific keywords are kept.
nrows int
Number of rows in the new table. If the input columns have data associated with them, the size of the largest input column is used. Otherwise the default is 0.
fill bool
If True , will fill all cells with zeros or blanks. If False , copy the data from the input; undefined cells will still be filled with zeros/blanks.
character_as_bytes bool
Whether to return bytes for string columns when accessed from the HDU. By default this is False and (unicode) strings are returned, but for large tables this may use up a lot of memory.
Any additional keyword arguments accepted by the HDU class’s __init__ may also be passed in as keyword arguments.
classmethod fromstring ( data , checksum = False , ignore_missing_end = False , ** kwargs ) #
Creates a new HDU object of the appropriate type from a string containing the HDU’s entire header and, optionally, its data.
Note: When creating a new HDU from a string without a backing file object, the data of that HDU may be read-only. It depends on whether the underlying string was an immutable Python str/bytes object, or some kind of read-write memory buffer such as a memoryview .
Parameters : data str or bytes
A byte string containing the HDU’s header and data.
checksum bool , optional
Check the HDU’s checksum and/or datasum.
ignore_missing_end bool , optional
Ignore a missing end card in the header data. Note that without the end card, the end of the header may be ambiguous and result in a corrupt HDU. In this case the assumption is that the first 2880-byte block that does not begin with valid FITS header data is the beginning of the data.
**kwargs optional
May consist of additional keyword arguments specific to an HDU type; these correspond to keywords recognized by the constructors of different HDU classes such as PrimaryHDU , ImageHDU , or BinTableHDU . Any unrecognized keyword arguments are simply ignored.
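A sketch of serializing an HDU to bytes and recreating it with fromstring:

```python
import io

import numpy as np
from astropy.io import fits

original = fits.PrimaryHDU(data=np.arange(12, dtype='>i4').reshape(3, 4))

# Serialize the whole HDU (header + data) to an in-memory byte string.
buf = io.BytesIO()
original.writeto(buf)

# Recreate the HDU from the raw bytes.
restored = fits.PrimaryHDU.fromstring(buf.getvalue())
print(restored.data.shape)
```

As the note above warns, the data of restored may be read-only here, since it is backed by an immutable bytes object rather than a file.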
classmethod load ( datafile , cdfile = None , hfile = None , replace = False , header = None ) [source] #
Create a table from the input ASCII files. The input is from up to three separate files, one containing column definitions, one containing header parameters, and one containing column data.
The column definition and header parameters files are not required. When absent the column definitions and/or header parameters are taken from the header object given in the header argument; otherwise sensible defaults are inferred (though this mode is not recommended).
Parameters : datafile path-like object or file-like object
Input data file containing the table data in ASCII format.
cdfile path-like object or file-like object , optional
Input column definition file containing the names, formats, display formats, physical units, multidimensional array dimensions, undefined values, scale factors, and offsets associated with the columns in the table. If None , the column definitions are taken from the current values in this object.
hfile path-like object or file-like object , optional
Input parameter definition file containing the header parameter definitions to be associated with the table. If None , the header parameter definitions are taken from the current values in this objects header.
replace bool , optional
When True , indicates that the entire header should be replaced with the contents of the ASCII file instead of just updating the current header.
header Header , optional
When the cdfile and hfile are missing, use this Header object in the creation of the new table and HDU. Otherwise this Header supersedes the keywords from hfile, which is only used to update values not present in this Header, unless replace=True , in which case this Header’s values are completely replaced with the values from hfile.
The primary use for the load method is to allow the input of table data and parameters from ASCII files that were edited in a standard text editor. The dump method can be used to create the initial ASCII files.
Note This format does not support variable-length arrays using the ‘Q’ format, due to difficult-to-overcome ambiguities. What this means is that this file format cannot support VLA columns in tables stored in files that are over 2 GB in size.
This is an abstract type that implements the shared functionality of the ASCII and Binary Table HDU types, which should be used instead of this.
classmethod readfrom ( fileobj , checksum = False , ignore_missing_end = False , ** kwargs ) #
Read the HDU from a file. Normally an HDU should be opened with open() which reads the entire HDU list in a FITS file. But this method is still provided for symmetry with writeto() .
Parameters : fileobj file-like object
Input FITS file. The file’s seek pointer is assumed to be at the beginning of the HDU.
checksum bool
If True , verifies that both DATASUM and CHECKSUM card values (when present in the HDU header) match the header and data of all HDU’s in the file.
ignore_missing_end bool
Do not issue an exception when opening a file that is missing an END card in the last header.
req_cards ( keyword , pos , test , fix_value , option , errlist ) #
Check the existence, location, and value of a required Card .
Parameters : keyword str
The keyword to validate
pos int or callable
If an int , this specifies the exact location this card should have in the header. Remember that Python is zero-indexed, so pos=0 requires the card to be the first card in the header. If given a callable, it should take one argument (the actual position of the keyword) and return True or False . This can be used for custom evaluation. For example, pos=lambda idx: idx > 10 checks that the keyword’s index is greater than 10.
test callable
This should be a callable (generally a function) that is passed the value of the given keyword and returns True or False . This can be used to validate the value associated with the given keyword.
fix_value str , int , float , complex , bool , or None
A valid value for a FITS keyword to use if the given test fails to replace an invalid value. In other words, this provides a default value to use as a replacement if the keyword’s current value is invalid. If None , there is no replacement value and the keyword is unfixable.
option str
Output verification option. Must be one of "fix" , "silentfix" , "ignore" , "warn" , or "exception" . May also be any combination of "fix" or "silentfix" with "+ignore" , "+warn" , or "+exception" (e.g. "fix+warn" ). See Verification Options for more info.
errlist list
A list of validation errors already found in the FITS file; this is used primarily for the validation system to collect errors across multiple HDUs and multiple calls to req_cards .
If pos=None , the card can be anywhere in the header. If the card does not exist, the new card will have the fix_value as its value when created. The card’s value is also checked using the test argument.
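A sketch of how req_cards might be called during validation (the keyword, position, and test here are illustrative; "warn" reports problems instead of raising):

```python
from astropy.io import fits

hdu = fits.PrimaryHDU()
errs = []

# Require the SIMPLE card to be first in the header with the value True.
hdu.req_cards('SIMPLE', 0, lambda v: v is True, True, 'warn', errs)
print(type(errs).__name__)
```

Any problems found are accumulated in the errs list, which can be shared across multiple req_cards calls.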
run_option ( option = 'warn' , err_text = '' , fix_text = 'Fixed.' , fix = None , fixable = True ) #
Execute the verification with selected option.
Size (in bytes) of the data portion of the HDU.
Deprecated since version v6.0: The update function is deprecated and may be removed in a future version. Use update_header instead.
Update header keywords to reflect recent changes of columns.
verify ( option = 'warn' ) #
Verify all values in the instance.
Parameters : option str
Output verification option. Must be one of "fix" , "silentfix" , "ignore" , "warn" , or "exception" . May also be any combination of "fix" or "silentfix" with "+ignore" , "+warn" , or "+exception" (e.g. "fix+warn" ). See Verification Options for more info.
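For instance (a minimal sketch):

```python
from astropy.io import fits

hdu = fits.PrimaryHDU()

# A freshly created HDU should pass verification; with
# option='exception' any problem found would raise VerifyError.
result = hdu.verify('exception')
print(result)  # verify returns None; success means no exception was raised
```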
Verify that the value in the CHECKSUM keyword matches the value calculated for the current HDU CHECKSUM.
Verify that the value in the DATASUM keyword matches the value calculated for the DATASUM of the current HDU data.
Works similarly to the normal writeto() , but prepends a default PrimaryHDU , as required by extension HDUs (which cannot stand on their own).
class astropy.io.fits. TableHDU ( data = None , header = None , name = None , ver = None , character_as_bytes = False ) [source] #
FITS ASCII table extension HDU class.
Parameters : data array or FITS_rec
Data to be used.
header Header
Header to be used.
name str
Name to be populated in EXTNAME keyword.
ver int > 0 or None , optional
The version number of the HDU; this will be the value of the keyword EXTVER . If not given or None, it defaults to the value of the EXTVER card of the header, or 1. (default: None)
character_as_bytes bool
Whether to return bytes for string columns. By default this is False and (unicode) strings are returned, but this does not respect memory mapping and loads the whole column in memory when accessed.
add_checksum ( when = None , override_datasum = False , checksum_keyword = 'CHECKSUM' , datasum_keyword = 'DATASUM' ) #
Add the CHECKSUM and DATASUM cards to this HDU with the values set to the checksum calculated for the HDU and the data respectively. The addition of the DATASUM card may be overridden.
Parameters : when str , optional
Comment string for the cards; by default the comments will represent the time when the checksum was calculated.
override_datasum bool , optional
If True , add only the CHECKSUM card; the DATASUM card is not recomputed.
checksum_keyword str , optional
The name of the header keyword to store the checksum value in; this is typically ‘CHECKSUM’ per convention, but there exist use cases in which a different keyword should be used
datasum_keyword str , optional
The name of the header keyword to store the datasum value in; this is typically ‘DATASUM’ per convention, but there exist use cases in which a different keyword should be used
For testing purposes, first call add_datasum with a when argument, then call add_checksum with a when argument and override_datasum set to True . This will provide consistent comments for both cards and enable the generation of a CHECKSUM card with a consistent value.
add_datasum ( when = None , datasum_keyword = 'DATASUM' ) #
Add the DATASUM card to this HDU with the value set to the checksum calculated for the data.
Parameters : when str , optional
Comment string for the card that by default represents the time when the checksum was calculated
datasum_keyword str , optional
The name of the header keyword to store the datasum value in; this is typically ‘DATASUM’ per convention, but there exist use cases in which a different keyword should be used
Returns : checksum int
The calculated datasum
For testing purposes, provide a when argument to enable the comment value in the card to remain consistent. This will enable the generation of a CHECKSUM card with a consistent value.
The ColDefs object describing the columns in this table.
Make a copy of the table HDU, both header and data are copied.
Calculates and returns the number of bytes that this HDU will write to a file.
Returns a dictionary detailing information about the locations of this HDU within any associated file. The values are only valid after a read or write of the associated file with no intervening changes to the HDUList .
The dictionary details information about the locations of this HDU within an associated file. Returns None when the HDU is not associated with a file.
file : File object associated with the HDU
filemode : Mode in which the file was opened (readonly, copyonwrite, update, append, ostream)
hdrLoc : Starting byte location of header in file
datLoc : Starting byte location of data block in file
datSpan : Data size including padding
classmethod from_columns ( columns , header = None , nrows = 0 , fill = False , character_as_bytes = False , ** kwargs ) #
Given either a ColDefs object, a sequence of Column objects, or another table HDU or table data (a FITS_rec or multi-field numpy.ndarray or numpy.recarray object), return a new table HDU of the class this method was called on, using the column definitions from the input.
Parameters : columns sequence of Column or ColDefs -like
The columns from which to create the table data, or an object with a column-like structure from which a ColDefs can be instantiated. This includes an existing BinTableHDU or TableHDU , or a numpy.recarray , to give some examples.
If these columns have data arrays attached that data may be used in initializing the new table. Otherwise the input columns will be used as a template for a new table with the requested number of rows.
header Header
An optional Header object to instantiate the new HDU with. Header keywords specifically related to defining the table structure (such as the “TXXXn” keywords like TTYPEn) will be overridden by the supplied column definitions, but all other informational and data model-specific keywords are kept.
nrows int
Number of rows in the new table. If the input columns have data associated with them, the size of the largest input column is used. Otherwise the default is 0.
fill bool
If True , will fill all cells with zeros or blanks. If False , copy the data from the input; undefined cells will still be filled with zeros/blanks.
character_as_bytes bool
Whether to return bytes for string columns when accessed from the HDU. By default this is False and (unicode) strings are returned, but for large tables this may use up a lot of memory.
Any additional keyword arguments accepted by the HDU class’s __init__ may also be passed in as keyword arguments.
classmethod fromstring ( data , checksum = False , ignore_missing_end = False , ** kwargs ) #
Creates a new HDU object of the appropriate type from a string containing the HDU’s entire header and, optionally, its data.
Note: When creating a new HDU from a string without a backing file object, the data of that HDU may be read-only. It depends on whether the underlying string was an immutable Python str/bytes object, or some kind of read-write memory buffer such as a memoryview .
Parameters : data str or bytes
A byte string containing the HDU’s header and data.
checksum bool , optional
Check the HDU’s checksum and/or datasum.
ignore_missing_end bool , optional
Ignore a missing end card in the header data. Note that without the end card, the end of the header may be ambiguous and result in a corrupt HDU. In this case the assumption is that the first 2880-byte block that does not begin with valid FITS header data is the beginning of the data.
**kwargs optional
May consist of additional keyword arguments specific to an HDU type; these correspond to keywords recognized by the constructors of different HDU classes such as PrimaryHDU , ImageHDU , or BinTableHDU . Any unrecognized keyword arguments are simply ignored.
This is an abstract type that implements the shared functionality of the ASCII and Binary Table HDU types, which should be used instead of this.
classmethod readfrom ( fileobj , checksum = False , ignore_missing_end = False , ** kwargs ) #
Read the HDU from a file. Normally an HDU should be opened with open() which reads the entire HDU list in a FITS file. But this method is still provided for symmetry with writeto() .
Parameters : fileobj file-like object
Input FITS file. The file’s seek pointer is assumed to be at the beginning of the HDU.
checksum bool
If True , verifies that both DATASUM and CHECKSUM card values (when present in the HDU header) match the header and data of all HDU’s in the file.
ignore_missing_end bool
Do not issue an exception when opening a file that is missing an END card in the last header.
req_cards ( keyword , pos , test , fix_value , option , errlist ) #
Check the existence, location, and value of a required Card .
Parameters : keyword str
The keyword to validate
pos int or callable
If an int , this specifies the exact location this card should have in the header. Remember that Python is zero-indexed, so pos=0 requires the card to be the first card in the header. If given a callable, it should take one argument (the actual position of the keyword) and return True or False . This can be used for custom evaluation. For example, pos=lambda idx: idx > 10 checks that the keyword’s index is greater than 10.
test callable
This should be a callable (generally a function) that is passed the value of the given keyword and returns True or False . This can be used to validate the value associated with the given keyword.
fix_value str , int , float , complex , bool , or None
A valid value for a FITS keyword to use if the given test fails to replace an invalid value. In other words, this provides a default value to use as a replacement if the keyword’s current value is invalid. If None , there is no replacement value and the keyword is unfixable.
option str
Output verification option. Must be one of "fix" , "silentfix" , "ignore" , "warn" , or "exception" . May also be any combination of "fix" or "silentfix" with "+ignore" , "+warn" , or "+exception" (e.g. "fix+warn" ). See Verification Options for more info.
errlist list
A list of validation errors already found in the FITS file; this is used primarily for the validation system to collect errors across multiple HDUs and multiple calls to req_cards .
If pos=None , the card can be anywhere in the header. If the card does not exist, the new card will have the fix_value as its value when created. The card’s value is also checked using the test argument.
run_option ( option = 'warn' , err_text = '' , fix_text = 'Fixed.' , fix = None , fixable = True ) #
Execute the verification with selected option.
Size (in bytes) of the data portion of the HDU.
Deprecated since version v6.0: The update function is deprecated and may be removed in a future version. Use update_header instead.
Update header keywords to reflect recent changes of columns.
verify ( option = 'warn' ) #
Verify all values in the instance.
Parameters : option str
Output verification option. Must be one of "fix" , "silentfix" , "ignore" , "warn" , or "exception" . May also be any combination of "fix" or "silentfix" with "+ignore" , "+warn" , or "+exception" (e.g. "fix+warn" ). See Verification Options for more info.
Verify that the value in the CHECKSUM keyword matches the value calculated for the current HDU CHECKSUM.
Verify that the value in the DATASUM keyword matches the value calculated for the DATASUM of the current HDU data.
Works similarly to the normal writeto() , but prepends a default PrimaryHDU , as required by extension HDUs (which cannot stand on their own).
class astropy.io.fits. Column ( name = None , format = None , unit = None , null = None , bscale = None , bzero = None , disp = None , start = None , dim = None , array = None , ascii = None , coord_type = None , coord_unit = None , coord_ref_point = None , coord_ref_value = None , coord_inc = None , time_ref_pos = None ) [source] #
Class which contains the definition of one column, e.g. ttype , tform , etc. and the array containing values for the column.
Construct a Column by specifying attributes. All attributes except format can be optional; see Column Creation and Creating an ASCII Table for more information regarding TFORM keyword.
Parameters : name str , optional
column name, corresponding to TTYPE keyword
format str
column format, corresponding to TFORM keyword
unit str , optional
column unit, corresponding to TUNIT keyword
null str , optional
null value, corresponding to TNULL keyword
bscale int-like, optional
bscale value, corresponding to TSCAL keyword
bzero int-like, optional
bzero value, corresponding to TZERO keyword
disp str , optional
display format, corresponding to TDISP keyword
start int , optional
column starting position (ASCII table only), corresponding to TBCOL keyword
dim str , optional
column dimension corresponding to TDIM keyword
array iterable , optional
a list , numpy.ndarray , or other iterable that can be used to initialize an ndarray, providing initial data for this column. The array will be automatically converted, if possible, to the data format of the column. In the case where non-trivial bscale and/or bzero arguments are given, the values in the array must be the physical values; that is, the values of the column as if the scaling has already been applied (the array stored on the column object will then be converted back to its storage values).
ascii bool , optional
set True if this describes a column for an ASCII table; this may be required to disambiguate the column format
coord_type str , optional
coordinate/axis type corresponding to TCTYP keyword
coord_unit str , optional
coordinate/axis unit corresponding to TCUNI keyword
coord_ref_point int-like, optional
pixel coordinate of the reference point corresponding to TCRPX keyword
coord_ref_value int-like, optional
coordinate value at reference point corresponding to TCRVL keyword
coord_inc int-like, optional
coordinate increment at reference point corresponding to TCDLT keyword
time_ref_pos str , optional
reference position for a time coordinate column corresponding to TRPOS keyword
The Numpy ndarray associated with this Column .
If the column was instantiated with an array passed to the array argument, this will return that array. However, if the column is later added to a table, such as via BinTableHDU.from_columns as is typically the case, this attribute will be updated to reference the associated field in the table, which may no longer be the same array.
Whether this Column represents a column in an ASCII table.
Return a copy of this Column .
Column definitions class.
It has attributes corresponding to the Column attributes (e.g. ColDefs has the attribute names while Column has name ). Each attribute in ColDefs is a list of corresponding attribute values from all Column objects.
Parameters : input sequence of Column or ColDefs or ndarray or recarray
An existing table HDU, an existing ColDefs , or any multi-field Numpy array or numpy.recarray .
ascii bool
Use True to ensure that ASCII table columns are used.
Append one Column to the column definition.
change_attrib ( col_name , attrib , new_value ) [source] #
Change an attribute (in the KEYWORD_ATTRIBUTES list) of a Column .
Parameters : col_name str or int
The column name or index to change
attrib str
The attribute name
new_value object
The new value for the attribute
Change a Column ’s name.
Parameters : col_name str
The current name of the column
new_name str
The new name of the column
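A sketch of creating a ColDefs and renaming one of its columns:

```python
from astropy.io import fits

cols = fits.ColDefs([fits.Column(name='raw', format='J'),
                     fits.Column(name='err', format='E')])

# Rename the 'raw' column to 'counts'.
cols.change_name('raw', 'counts')
print(cols.names)
```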
Change a Column ’s unit.
Parameters : col_name str or int
The column name or index
new_unit str
The new unit for the column
Delete (the definition of) one Column .
col_name str or int
The column’s name or index
Get attribute(s) information of the column definition.
Parameters : attrib str
Can be one or more of the attributes listed in astropy.io.fits.column.KEYWORD_ATTRIBUTES . The default is "all" which will print out all attributes. It forgives plurals and blanks. If there are two or more attribute names, they must be separated by comma(s).
output file-like object , optional
File-like object to output to. Outputs to stdout by default. If False , returns the attributes as a dict instead.
This function doesn’t return anything by default; it just prints to stdout.
FITS record array class.
FITS_rec is the data part of a table HDU. It is a layer over the recarray , so we can deal with scaled columns.
It inherits all of the standard methods from numpy.ndarray .
Construct a FITS record array from a recarray.
A user-visible accessor for the coldefs.
The Numpy documentation lies; numpy.ndarray.copy is not equivalent to numpy.copy . Differences include that it re-views the copied array as self’s ndarray subclass, as though it were taking a slice; this means __array_finalize__ is called and the copy shares all the array attributes (including ._converted !). So we need to make a deep copy of all those attributes so that the two arrays truly do not share any data.
A view of a Column ’s data as an array.
List of column FITS formats.
classmethod from_columns ( columns , nrows = 0 , fill = False , character_as_bytes = False ) [source] #
Given a ColDefs object of unknown origin, initialize a new FITS_rec object.
This was originally part of the new_table function in the table module but was moved into a class method since most of its functionality always had more to do with initializing a FITS_rec object than anything else, and much of it also overlapped with FITS_rec._scale_back .
Parameters : columns sequence of Column or a ColDefs
The columns from which to create the table data. If these columns have data arrays attached, that data may be used in initializing the new table. Otherwise the input columns will be used as a template for a new table with the requested number of rows.
nrows int
Number of rows in the new table. If the input columns have data associated with them, the size of the largest input column is used. Otherwise the default is 0.
fill bool
If True , will fill all cells with zeros or blanks. If False , copy the data from the input; undefined cells will still be filled with zeros/blanks.
List of column names.
Return the array as an a.ndim -levels deep nested list of Python scalars.
Return a copy of the array data as a (nested) Python list. Data items are converted to the nearest compatible builtin Python type, via the item function.
If a.ndim is 0, then since the depth of the nested list is 0, it will not be a list at all, but a simple Python scalar.
Parameters : none
Returns : y object , or list of object , or list of list of object , or …
The possibly nested list of array elements.
The array may be recreated via a = np.array(a.tolist()) , although this may sometimes lose precision.
For a 1D array, a.tolist() is almost the same as list(a) , except that tolist changes numpy scalars to Python scalars:
>>> a = np.uint32([1, 2])
>>> a_list = list(a)
>>> a_list
[1, 2]
>>> type(a_list[0])
<class 'numpy.uint32'>
>>> a_tolist = a.tolist()
>>> a_tolist
[1, 2]
>>> type(a_tolist[0])
<class 'int'>
Additionally, for a 2D array, tolist applies recursively:
>>> a = np.array([[1, 2], [3, 4]])
>>> list(a)
[array([1, 2]), array([3, 4])]
>>> a.tolist()
[[1, 2], [3, 4]]
The base case for this recursion is a 0D array:
>>> a = np.array(1)
>>> list(a)
Traceback (most recent call last):
  ...
TypeError: iteration over a 0-d array
>>> a.tolist()
1
class astropy.io.fits. FITS_record ( input , row = 0 , start = None , end = None , step = None , base = None , ** kwargs ) [source] #
FITS record class.
FITS_record is used to access records of the FITS_rec object. This will allow us to deal with scaled columns. It also handles conversion/scaling of columns in ASCII tables. The FITS_record class expects a FITS_rec object as input.
Parameters : input array
The array to wrap.
row int , optional
The starting logical row of the array.
start int , optional
The starting column in the row associated with this object. Used for subsetting the columns of the FITS_rec object.
end int , optional
The ending column in the row associated with this object. Used for subsetting the columns of the FITS_rec object.
Get the field data of the record.
Set the field data of the record.
astropy.io.fits. tabledump ( filename , datafile = None , cdfile = None , hfile = None , ext = 1 , overwrite = False ) [source] #
Dump a table HDU to a file in ASCII format. The table may be dumped in three separate files, one containing column definitions, one containing header parameters, and one for table data.
Parameters : filename path-like object or file-like object
Input fits file.
datafile path-like object or file-like object , optional
Output data file. The default is the root name of the input fits file appended with an underscore, followed by the extension number (ext), followed by the extension .txt .
cdfile path-like object or file-like object , optional
Output column definitions file. The default is None , no column definitions output is produced.
hfile path-like object or file-like object , optional
Output header parameters file. The default is None , no header parameters output is produced.
ext int
The number of the extension containing the table HDU to be dumped.
overwrite bool , optional
If True , overwrite the output file if it exists. Raises an OSError if False and the output file exists. Default is False .
The primary use for the tabledump function is to allow viewing and editing of the table data and parameters in a standard text editor. The tableload function can be used to reassemble the table from the three ASCII files.
Note This format does not support variable-length arrays using the ‘Q’ format, due to difficult-to-overcome ambiguities. What this means is that this file format cannot support VLA columns in tables stored in files that are over 2 GB in size.
Create a table from the input ASCII files. The input is from up to three separate files, one containing column definitions, one containing header parameters, and one containing column data. The header parameters file is not required. When the header parameters file is absent a minimal header is constructed.
Parameters : datafile path-like object or file-like object
Input data file containing the table data in ASCII format.
cdfile path-like object or file-like object , optional
Input column definition file containing the names, formats, display formats, physical units, multidimensional array dimensions, undefined values, scale factors, and offsets associated with the columns in the table.
hfile path-like object or file-like object , optional
Input parameter definition file containing the header parameter definitions to be associated with the table. If None , a minimal header is constructed.
The primary use for the tableload function is to allow the input of table data and parameters from ASCII files that were edited in a standard text editor. The tabledump function can be used to create the initial ASCII files.
Note This format does not support variable-length arrays using the ‘Q’ format, due to difficult-to-overcome ambiguities. What this means is that this file format cannot support VLA columns in tables stored in files that are over 2 GB in size.
Convert a Table object to a FITS BinTableHDU .
Parameters : table astropy.table.Table
The table to convert.
character_as_bytes bool
Whether to return bytes for string columns when accessed from the HDU. By default this is False and (unicode) strings are returned, but for large tables this may use up a lot of memory.
Returns : table_hdu BinTableHDU
The FITS binary table HDU.