Junocam: Juno’s Outreach Camera
- Cite this article as: Hansen, C.J., Caplinger, M.A., Ingersoll, A., et al. Space Sci. Rev. (2014). doi:10.1007/s11214-014-0079-x
Junocam is a wide-angle camera designed to capture the unique polar perspective of Jupiter offered by Juno’s polar orbit. Junocam’s four-color images include the best spatial resolution ever acquired of Jupiter’s cloudtops. Junocam will look for convective clouds and lightning in thunderstorms and derive the heights of the clouds. Junocam will support Juno’s radiometer experiment by identifying any unusual atmospheric conditions such as hotspots. Junocam is on the spacecraft explicitly to reach out to the public and share the excitement of space exploration. The public is an essential part of our virtual team: amateur astronomers will supply ground-based images for use in planning, the public will weigh in on which images to acquire, and the amateur image processing community will help process the data.
Keywords: Juno · Jupiter · Jupiter’s poles · Jupiter’s atmosphere · images of Jupiter
The scientific themes of the Juno mission are to study the interior, atmosphere, and magnetosphere of Jupiter (Bolton et al., this issue). The spacecraft has been highly optimized for the operation of its seven science instruments, leading to a solar-powered, sun-pointing, spinning design. Such a platform presents challenges for imaging, both from motion blur and pointing geometry. But it was appreciated that visible imaging is an important component of public engagement for any mission. So a visible camera, Junocam, was included primarily for education and public outreach (EPO), funded from the mission’s EPO budget and given a fairly constrained allocation of spacecraft mass resources.
Despite the challenges, Juno’s polar orbit offers a unique vantage point for imaging Jupiter compared to other missions that have orbited or flown by Jupiter. The orbital inclination is high, and the closest point in the orbit, perijove, is near the equator. The orbit plane is nearly perpendicular to the Sun-Jupiter line, so the spacecraft is generally flying along the terminator. The orbit enables observation of the poles at low emission angles, and features close approaches to Jupiter—about 5000 km above the cloud tops at perijove. This allows almost an order of magnitude improvement in resolution compared with Galileo’s best, which is in the range 20–25 km (Little et al. 1999). No imaging was obtained during Galileo’s end-of-mission impact into Jupiter.
The science and EPO objectives evolved from trade studies balancing the unique imaging opportunities arising from the Juno orbit with the constraints of cost and spacecraft resources (mass, volume and power) available for an EPO camera. Observing the pole is one such opportunity, both for EPO and science, and it leads to the requirement of imaging the entire polar region in three colors (red, green and blue) as the spacecraft passes over the poles ±1 h from closest approach. This in turn leads to a field of view requirement of about 60 degrees. Another opportunity is studying the equatorial region at ten times higher spatial resolution than that of Voyager, Galileo, and Cassini. It leads to a resolution requirement of 3 km/pixel at perijove, when the spacecraft is near the equator, and 50 km/pixel when the spacecraft is over the pole.
The third opportunity is to serve as the “eyes” in visible and near-infrared light for other remote sensing instruments on Juno. These include the microwave radiometer (MWR), the ultraviolet imaging spectrograph (UVS), and the Jovian infrared auroral mapper (JIRAM). Since clouds are both signal and noise for these instruments, the ability to image clouds becomes an additional requirement for Junocam. This is addressed by having a filter at 889 nm, which is an absorption band of methane, a well-mixed gas in Jupiter’s atmosphere. High clouds and hazes stand out in 889 nm images, since they reflect more sunlight than the absorbing gas around them. A single methane filter is not a comprehensive cloud detector, but it is the optimal choice given severe constraints of cost, mass, camera sensitivity, and compatibility with the basic camera design.
To optimize the camera design within constrained resources, the camera electronics design is based on that developed earlier for the Mars Science Laboratory (MSL) mission (Edgett et al. 2012). The flexibility inherent in that design allowed the addition of multiband “pushframe” imaging and time-delayed integration (TDI), described in Sect. 3. These enhancements give adequate signal to noise ratio (SNR) in the images despite the spin of the spacecraft. The radiation environment for Juno, while avoiding the worst areas of the jovian radiation belts, is still many times harsher than that of the MSL mission. Substantial extra shielding mass was added to both the optics and electronics and some revision was made to selected parts. While Junocam is only required and qualified to survive for the first three months of the mission (through orbit 8), we expect the degradation of the instrument to be graceful.
The remainder of this introduction is a brief overview of the camera and its field of view during the hours around closest approach. Section 2 gives the expected science return. Section 3 gives the detailed instrument description. Section 4 describes the calibration results. Section 5 describes operations and commanding, and Section 6 gives the outreach plans.
2 Expected Science Return
Junocam takes advantage of Juno’s unique polar orbit, its extremely low-altitude perijove, and its complementary suite of instruments that probe the interior and atmosphere. The orbit enables one to study the atmospheric dynamics, the clouds, and the aurora right up to the pole, which no spacecraft has ever done before. The orbit also enables one to study the equatorial clouds and winds “up close,” with a spatial scale of 3 km per pixel at perijove. Finally, Junocam provides “eyes” in visible light for three other instruments, the microwave radiometer (MWR), which peers through the clouds down to 100 bar levels (Janssen et al. 2014), the ultraviolet imaging spectrograph (UVS), which studies UV auroral emissions in the polar magnetosphere (Gladstone et al. 2014), and the Jovian infrared auroral mapper (JIRAM), which studies IR auroral emissions and the IR emissions emanating from the clouds at all latitudes (Adriani et al. 2014).
Junocam’s objectives are to:
- Investigate the nature and scale of meteorological phenomena at the poles, filling in the lack of coverage by Voyager, Galileo, and Cassini, and study the circumpolar waves detected by those earlier spacecraft.
- Investigate atmospheric phenomena at scales ten times finer than those resolved by Voyager, Galileo, and Cassini.
- Serve as the “eyes” for other instruments on the Juno spacecraft by imaging clouds at the poles, at high altitudes, and at high resolution.
To understand Junocam’s contributions to our knowledge of giant planets, it is useful to review what we know and don’t know about them. Jupiter and Saturn come closest to solar composition, but they seem to be enriched in elements heavier than hydrogen and helium by factors ranging from 2 to 10 (Niemann et al. 1998; Mahaffy et al. 2000; Atreya 2010). Despite this enrichment, the heavier elements contribute less than 1 % of the atoms in these planets’ atmospheres and interiors. Oxygen, which is the third most abundant element on the Sun after hydrogen and helium, is a conundrum. It appears as water, but not in the right abundance. The Galileo probe went into a dry place, a so-called 5-micron hot spot, which is a giant hole in the clouds that allows 5-micron thermal emission from warmer, deeper levels to escape (Ortiz et al. 1998). It would be easy to blame water’s poor showing on the meteorology of hot spots, but it would be better to find the water and to understand hot spot meteorology (Bjoraker et al. 1986; Roos-Serote et al. 2004).
Junocam can identify the hot spots and examine them at high spatial resolution (Vasavada et al. 1998). It can look for convective clouds and lightning (Gierasch et al. 2000); it can estimate cloud heights using stereo and methane band imaging, and it can help the JIRAM instrument as it measures temperatures and cloud optical thickness. It will be able to measure small-scale winds if they are large enough (Vasavada et al. 1998). Junocam can define the structure at cloud top level—the belts and zones and their dynamical structures—to correlate with the water and ammonia abundance that MWR is measuring deeper down. Since meteorology affected the water measurement by the Galileo probe, it is important to know the meteorology and how it might affect the water measurement by the Juno MWR.
Junocam’s 3 km per pixel horizontal resolution near perijove is unprecedented. It will allow one to see individual features within thunderstorms. On Earth, thunderstorms are about as wide as they are tall, and range up to 15 km in both dimensions. Within each storm there are smaller-scale features. Juno will be able to see these small structures if they are present. One problem is that thunderstorms are relatively rare on Jupiter. The Galileo imaging system detected 26 lightning storms, typically 1000 km in diameter, on the night side of Jupiter (Little et al. 1999), implying that the typical distance between storms is about 10,000 km. In contrast, Earth has ∼2000 thunderstorms spread over the planet at any one time (Uman 1987), implying that the typical distance between them is ∼500 km. We are discussing thunderstorms because they have small-scale structures, but it is possible that other meteorological features have small-scale structures that Junocam will discover at 3 km per pixel resolution.
Junocam measures the height of the clouds in three ways: by stereo imaging, from cloud shadows, and by methane band imaging. Most of the time the spacecraft’s spin axis and orbital axis are both pointed toward Earth. As the spacecraft spins, the field of view of the instruments, including Junocam, sweeps repeatedly along the path of the spacecraft on the clouds below. Junocam takes a 4-color image every 60 s, during which time the spacecraft moves along its path by ∼3400 km. At closest approach, a point on the planet seen at nadir (0° emission angle) will be seen at least six other times, at emission angles of ±34°, ±54°, and ±64°, which offers excellent opportunities for measuring the relative heights of clouds. To be measurable, cloud height variations must be at least 3 km, since that is the pixel size at perijove. Such heights are likely, since the scale height in the mid troposphere is 20–25 km. The spacecraft flies close to the terminator, so cloud shadows will extend several times farther than the cloud heights and should be readily observable.
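The shadow-length argument can be illustrated with a toy calculation. The 10° solar elevation below is an illustrative assumption for near-terminator lighting, not a value from the text; the 3 km cloud height is the minimum measurable height variation quoted above.

```python
import math

cloud_height_km = 3.0        # minimum measurable height variation (pixel size at perijove)
solar_elevation_deg = 10.0   # assumed low Sun near the terminator

# A cloud of height h casts a shadow of length h / tan(solar elevation).
shadow_km = cloud_height_km / math.tan(math.radians(solar_elevation_deg))
print(f"shadow length ~ {shadow_km:.0f} km")
```

Even a 3 km cloud casts a shadow several times its height, so shadows of resolvable clouds should themselves be resolvable.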
Junocam uses a filter at 889 nm to further measure cloud heights. Methane gas absorbs at this wavelength, so the photons must scatter off a high cloud if they are going to avoid absorption. Places where there are no high clouds look dark because the photons are absorbed. Photons in Junocam’s other three filter bands can scatter off deeper clouds, so the high clouds do not stand out as strongly as they do in the methane band. The level of unit optical depth at 889 nm has been estimated as 540 mbar when clouds are absent (Sanchez-Lavega et al. 2013), so the methane band images will detect clouds and hazes above this level.
In principle, with multiple looks at the same place, Junocam can measure winds. If the wind speed is 10 m s−1, the cloud displacement over a 2 min interval will be 1.2 km, which is less than the pixel size at closest approach and therefore hard to detect. But if the wind speed is 40 m s−1 (Vasavada et al. 1998), and the feature stays in the field of view for 4 min (8 rotations of the spacecraft), the displacement will be 9.6 km, which is measurable. The measurement will be easier if the clouds are at the same height, because then the displacement will not be confused with stereoscopic distortions due to variable cloud heights. When the spacecraft is farther away, e.g., over the pole, a feature stays in the field of view for at most 30 min. With a 40 m s−1 wind, the displacement will be 72 km, which is only slightly greater than the 50 km pixel size at that point in the orbit. Nevertheless, looking for motion is worthwhile. We can rely on Earth-based instruments to measure the large-scale winds up to fairly high latitudes, but only Junocam has a chance of measuring winds at the smallest scales and at the poles.
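The displacement numbers above are simple arithmetic and can be checked directly:

```python
# Worked check of the wind-displacement figures quoted in the text.
def displacement_km(wind_m_s: float, interval_s: float) -> float:
    """Cloud displacement for a given wind speed and tracking interval."""
    return wind_m_s * interval_s / 1000.0

assert displacement_km(10, 120) == 1.2    # 10 m/s over 2 min: below the 3 km pixel
assert displacement_km(40, 240) == 9.6    # 40 m/s over 4 min: measurable
assert displacement_km(40, 1800) == 72.0  # over the pole, 30 min window vs. 50 km pixel
```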
Junocam imagery will also be correlated with observations of the thermal structure of the polar regions forthcoming from ground-based support campaigns. One target is the detailed nature of the flow around the periphery of a broad polar vortex defined by a cold airmass, which appears to be coincident with the edges of a high-altitude polar haze detectable in the methane filter; the boundaries of both are characterized by an oscillating pattern with wavenumber 5–6. A second is auroral-related heating of the upper stratosphere by high-energy particles that likely originate with the aurora but are deposited away from the origin of the UV and near-infrared auroral emission.
Lightning is the signature of moist convection, which is an important meteorological process and therefore of great interest to the JIRAM and MWR teams. By telling them where moist convection is occurring, Junocam will help them interpret their data. We estimate Junocam might detect one or two flashes per orbital pass. The limiting factor is the frequency of lightning. A single storm flashes about once every 3 s (Little et al. 1999, Table III). Because the storms are so spread out, there are likely to be at most a few in a single Junocam image. The camera uses time-delayed integration to combine up to 100 exposures of 3.2 ms duration for an effective shutter time of 0.32 s. Regions on the planet are imaged 5–6 times as the spacecraft flies over them. If there are 2 or 3 storms in that region, the cumulative exposure will be greater than 3 s, and the camera is likely to see a lightning flash or two during one perijove pass. The high spatial resolution is an advantage in separating lightning from cosmic rays hitting the detector. Cosmic rays light up individual pixels, whereas photons from lightning are spread out as they diffuse up through the clouds and have half-widths at half-maxima in the range 45–80 km (Little et al. 1999).
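The one-or-two-flashes estimate follows from the numbers above. The assumption of 2 storms in the imaged region and 5 overpasses sits at the low end of the ranges quoted in the text:

```python
# Rough expected-flash count per perijove pass, using figures from the text.
effective_shutter_s = 0.32   # 100 TDI steps x 3.2 ms per exposure
overpasses = 5               # each region imaged 5-6 times per pass
storms_in_region = 2         # assumed 2-3 storms in the imaged region
flash_interval_s = 3.0       # one flash per ~3 s per storm (Little et al. 1999)

total_exposure_s = effective_shutter_s * overpasses * storms_in_region
expected_flashes = total_exposure_s / flash_interval_s
print(f"expected flashes ~ {expected_flashes:.1f}")
```

With 3 storms and 6 overpasses the same arithmetic gives about two flashes, bracketing the estimate in the text.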
2.2 Jupiter’s Rings
Junocam will be able to detect Jupiter’s main, optically thick ring. As the spacecraft rotates, the ring will pass through the Junocam field of view. The phase angle is generally close to 90°, which will likely preclude detection of the more tenuous portions of the gossamer and halo rings. It will, however, offer the opportunity to investigate the structure and phase function of the main ring from a unique perspective inside the rings as Juno passes through the equatorial plane.
2.3 Galilean Satellites
Opportunities for Junocam observations of the Galilean satellites

| Optimal time to image | Time relative to perijove (PJ) | Spatial scale (km/pix) |
|---|---|---|
| 19 Nov 13:20 | PJ4 − 50.75 h | |
| 22 Dec 20:36 | PJ7 − 41.5 h | |
| 22 March 02:52 | PJ15 − 6 h | |
| 12 April 17:24 | PJ17 − 24 h | |
| 13 April 04:34 | PJ17 − 13 h | |
| 7 June 00:49 | PJ22 − 3 h | |
| 22 July 08:58 | | |
| 31 July 13:00 | PJ27 − 11.5 h | |
| 31 July 21:21 | PJ27 − 3.3 h | |
| 24 Sept 08:37 | PJ32 − 12.8 h | |
| 24 Sept 18:02 | PJ32 − 3.3 h | |
3 Instrument Description
- Camera Head mass: 2.642 kg
- Camera Head dimensions: 3.8 × 3.9 × 7.5 in
- Field of view
- IFOV at center of field
The spacecraft spin rate would cause more than a pixel’s worth of image blurring for exposures longer than about 3.2 ms. For the illumination conditions at Jupiter such short exposures would result in unacceptably low SNR, so the camera provides Time-Delayed-Integration (TDI). TDI vertically shifts the image one row each 3.2 ms over the course of the exposure, canceling the scene motion induced by rotation. Up to about 100 TDI steps can be used for the orbital timing case while still maintaining the needed frame rate for frame-to-frame overlap.
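The 3.2 ms figure can be recovered from the spin rate. The nominal 2 rpm spin rate appears later in the text (Sect. 5.2); the IFOV of ~0.67 mrad is an assumption, since its value in the instrument table above was not recovered.

```python
import math

SPIN_RATE_RPM = 2.0    # nominal spacecraft spin rate (Sect. 5.2)
IFOV_RAD = 0.67e-3     # assumed instantaneous field of view per pixel

omega = SPIN_RATE_RPM * 2.0 * math.pi / 60.0   # angular rate, rad/s
dwell_per_pixel_s = IFOV_RAD / omega           # time for the scene to cross one pixel

# Exposures longer than this smear the scene by more than one pixel,
# which is why the TDI line time is 3.2 ms.
print(f"max unblurred exposure ~ {dwell_per_pixel_s * 1e3:.1f} ms")

# TDI shifts the image one row per line time, so N steps accumulate N line times.
n_tdi = 100
effective_exposure_s = n_tdi * dwell_per_pixel_s
print(f"effective exposure with {n_tdi} TDI steps ~ {effective_exposure_s:.2f} s")
```

The 0.32 s effective shutter time quoted in the lightning discussion (Sect. 2) is exactly this 100-step case.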
The pushframe imaging mode requires additional processing for image reconstruction. First, each exposed frame is read out to the spacecraft and the desired bands are extracted into 128-pixel-high “framelets”, editing out the unused lines between filters which may suffer from spectral crosstalk. After optional summing and compression, the framelets from all of the frames in an image are transmitted to Earth. The MSSS Ground Data System then treats each framelet as an individual image, using spacecraft attitude telemetry to map-project it onto a planetary shape model. Finally, each map-projected framelet is composited into an overall mosaic by spatial location and bandpass to form an output map.
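The framelet-extraction step can be sketched as follows. The frame dimensions and band row offsets are illustrative assumptions (only the 128-row framelet height comes from the text):

```python
import numpy as np

FRAMELET_ROWS = 128  # framelet height given in the text
# Assumed row positions of the filter strips on the detector (hypothetical values).
BAND_ROW_OFFSETS = {"blue": 0, "green": 155, "red": 310}

def extract_framelets(frame: np.ndarray, bands):
    """Cut the selected 128-row band strips out of one raw frame, discarding
    the unused rows between filters (which may suffer spectral crosstalk)."""
    return {b: frame[BAND_ROW_OFFSETS[b]:BAND_ROW_OFFSETS[b] + FRAMELET_ROWS, :]
            for b in bands}

frame = np.zeros((512, 1648), dtype=np.uint16)   # one raw frame (assumed size)
framelets = extract_framelets(frame, ["red", "green", "blue"])
for band, fl in framelets.items():
    assert fl.shape == (FRAMELET_ROWS, 1648)
```

On the ground, each such framelet is then map-projected independently and composited into the output mosaic, as described above.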
3.2 Camera Head (CH)
3.2.1 Electronics and Detector
The output signal from the CCD is AC-coupled and then amplified. The amplified signal is digitized to 12 bits at a maximum rate of 5 Mpixels/s. For each pixel, both reset and video levels are digitized and then subtracted in the digital domain to perform correlated double sampling (CDS), resulting in a typical 11 bits of dynamic range.
All CH functions are supervised by a single Actel RTSX field-programmable gate array (FPGA). In response to commands from the JDEA, the FPGA generates the CCD clocks, reads samples from the analog-to-digital converter (ADC) and performs digital CDS, and transmits the pixels to the JDEA.
The CH operates using regulated 5 V and ±15 V power provided by the JDEA. A platinum resistance thermometer (PRT) on the camera focal plane is read by the spacecraft to provide temperature knowledge for radiometric calibration. An additional pair of PRTs and redundant etched-foil heaters are attached to the outside of the camera head and thermostatically controlled by the spacecraft.
The CH electronics are laid out as a single rigid-flex printed circuit board with three rigid sections. The sections are sandwiched between housing sections that provide mechanical support and radiation shielding, and the flexible interconnects are enclosed in metal covers. For Junocam, additional radiation shielding was required and was incorporated into the housings, which are made of titanium. An additional copper-tungsten enclosure surrounds the image sensor. The total mass of the CH is about 2.6 kg.
Junocam filter characteristics
3.3 Junocam Digital Electronics Assembly (JDEA)
As originally proposed, Junocam was to have used a copy of the MSL Digital Electronics Assembly (DEA), which takes raw digital image data from the camera head, compresses it in real time, and stores it in a non-volatile memory buffer for later transmission. However, it soon became apparent that the digital electronics used in the DEA (particularly its Xilinx FPGA) would likely suffer too many radiation-induced upsets from the energetic protons trapped in the jovian radiation belts. Since most of the capabilities of the DEA were unneeded for Juno, we designed a new, more radiation-resistant version, called the Junocam DEA or JDEA.
The JDEA provides regulated power to the camera head, implements a minimal command sequencing capability to manage camera head pushframe operation, receives the raw digital image data from the camera head, applies 12-to-8-bit non-linear companding, and stores the image data in a 128 MB internal DRAM buffer. The CH command/data interface is a three-signal Low Voltage Differential Signaling (LVDS) synchronous serial link transmitting commands from JDEA to CH at 2 Mbit/s and a four-signal synchronous 3-bit parallel interface from CH to JDEA at a rate of 30 Mbit/s. The JDEA also contains a command/data interface with the spacecraft, receiving higher-level imaging commands and returning image data. The command interface is a bidirectional asynchronous RS-422 interface running at 57.6 Kbaud; the data interface is a unidirectional three-signal RS-422 synchronous interface running at 20 Mbits/s.
The JDEA uses an Actel RTSX FPGA. Most of the logic design is inherited from the previously built MSSS context imager on the Mars Reconnaissance Orbiter (MRO CTX) and Lunar Reconnaissance Orbiter Camera (LROC). The power subsystem uses Interpoint components and is derived from the MSL design.
The JDEA electronics are laid out as a single rectangular printed circuit board, sandwiched between housing sections that provide mechanical support and radiation shielding. The JDEA housings are aluminum, since considerable radiation shielding is provided by the spacecraft avionics vault.
3.4 Flight Software
As indicated above, there is no software resident in the instrument. All additional processing is performed by Junocam-provided software running in the spacecraft computer. This software has significant commonality with software previously developed by MSSS for the Mars Odyssey and MRO missions. It is written in ANSI C and uses the VxWorks multitasking facility so that processing can occur when the spacecraft computer is otherwise unoccupied.
The software receives commands to acquire images from the spacecraft’s command sequence engine. Each image command contains parameters such as exposure time, number of TDI stages, number of frames, interframe time, summing, and compression. Optionally, each image can be commanded relative to the spin phase (based on information provided by the spacecraft’s attitude control system) so that only frames that are pointed at the planet need be acquired. The software instructs the JDEA to begin imaging at the appropriate time and then delays until the entire multi-frame image is acquired. It then reads out the JDEA DRAM. The raw image data are stored in spacecraft DRAM and then read out, processed, and formatted for downlinking. Processing consists of frame editing, optional summing, optional median filtering to remove radiation-induced pixel transients, and optional lossy transform-based or lossless predictive image compression.
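The median-filtering step for radiation-induced pixel transients can be sketched as below. The text does not describe the onboard algorithm's kernel or threshold; a 3×3 spatial median with a fixed threshold is an assumption for illustration.

```python
import numpy as np

def despike(img: np.ndarray, threshold: float = 50.0) -> np.ndarray:
    """Replace pixels that stand far above their 3x3 neighborhood median,
    as an isolated particle hit would (kernel and threshold are assumed)."""
    padded = np.pad(img.astype(float), 1, mode="edge")
    # Build the 9 shifted views of the 3x3 neighborhood and take their median.
    stack = [padded[r:r + img.shape[0], c:c + img.shape[1]]
             for r in range(3) for c in range(3)]
    med = np.median(np.stack(stack), axis=0)
    out = img.astype(float).copy()
    spikes = out - med > threshold
    out[spikes] = med[spikes]
    return out

scene = np.full((16, 16), 100.0)
scene[8, 8] = 4000.0          # a simulated single-pixel particle hit
clean = despike(scene)
assert clean[8, 8] == 100.0   # transient removed; uniform scene untouched
```

This exploits the same property noted in Sect. 2: particle hits light up individual pixels, while real scene features (even lightning) are spread over many pixels.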
3.5 Radiation Effects
There are three radiation effects likely to be observable in Junocam images. The first will be persistent hot pixels caused by displacement damage from energetic particle hits to the sensor, primarily trapped protons. Ground testing indicates that at the end of orbit 8, Junocam will have accumulated fewer than a hundred hot pixels, a number that can be dealt with by onboard median filter processing. The second will be a global increase in dark current, leading to a slow degradation of image quality. We estimate the dark current increase will be less than 2× through the end of orbit 8. The third will be transient events: isolated pixels lit up by individual particle hits during an exposure, which the onboard median filtering is designed to remove.
Junocam was calibrated at MSSS in July 2010. Tables and figures in the following sections are the result of that calibration effort.
CCD Testing and Performance Validation: Validate CCD linearity, read noise, full well, gain, bias, and dark current at system level.
Absolute and relative radiometry: Determine conversion between data number (DN) and radiance for each filter; measure system noise equivalent spectral radiance at each wavelength.
Flatfields: Determine flatfield image for each filter.
System Spectral Throughput: Determine relative throughput of system over each filter’s bandpass; also determine rejection band throughput.
MTF/PSF Target Imaging: Measure Modulation Transfer Function (MTF) and Point Spread Function (PSF) at several different TDI levels.
Geometric Mapping Function: Determine the mapping of the angle from the optic axis in front of the lens to pixel position in the focal plane.
4.1 Linearity/Full Well
4.2 Flat Field
4.3 Dark Current and Bias
4.4 Gain and Read Noise
Read noise was determined by measuring the standard deviation of dark difference images. The measured read noise was 16.6 e–.
4.5 Absolute Response
Each pixel produced by the camera is represented by a 12-bit Data Number (DN) with a scale factor of 16.3 electrons/DN. These 12-bit DNs are then converted to 8-bit form using a piecewise-linear transfer function. For normal imaging, the transfer function is set to approximate square-root encoding to preserve the full 12-bit dynamic range in the presence of shot noise. If desired, any power-of-two linear mapping can also be used (divide by 16, 8, 4, 2, or 1) to simplify data processing; any remaining high-order bits are simply discarded.
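A minimal sketch of the square-root companding, assuming a smooth square-root curve rather than the actual onboard piecewise-linear table (which is not given in the text):

```python
import math

FULL_SCALE_12 = 4095   # maximum 12-bit DN
FULL_SCALE_8 = 255     # maximum 8-bit DN

def compand_sqrt(dn12: int) -> int:
    """Approximate square-root 12-to-8-bit encoding."""
    return round(math.sqrt(dn12) * FULL_SCALE_8 / math.sqrt(FULL_SCALE_12))

def expand(dn8: int) -> int:
    """Approximate inverse mapping back to 12-bit DN."""
    return round((dn8 * math.sqrt(FULL_SCALE_12) / FULL_SCALE_8) ** 2)

assert compand_sqrt(0) == 0
assert compand_sqrt(FULL_SCALE_12) == FULL_SCALE_8
# The alternative power-of-two linear mapping simply keeps the top 8 bits:
assert FULL_SCALE_12 // 16 == FULL_SCALE_8
```

Square-root encoding matches quantization step size to shot noise, which grows as the square root of the signal, so little information is lost going from 12 to 8 bits.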
Junocam expected signal levels, for conditions described in the text
Reference signal (e–)
Typical polar signal (e–)
4.6 Modulation Transfer Function (MTF)
4.7 Geometric Calibration
4.8 Stray Light
Our primary concern regarding stray light was the potential for visible light to leak under the narrowband methane filter. We measured this using the integrating sphere and the Quartz-Tungsten-Halogen (QTH) lamp, with and without an 850 nm long-wave-pass filter in place between the lamp and the sphere input. The signal level in the methane band was about 88 DN without the filter and about 74.3 DN with it, a leakage of about 18.4 %, but there was little evidence of structure in the leakage, so this is mostly leakage in the filter bandpass itself. Stray light from bright sources just outside the field shows no more than 1–2 % of additional signal.
With the camera mounted on the spacecraft, the off-sun angle must be greater than 75° to avoid direct illumination of the front lens element by rays not blocked by the sunshade. Ordinarily the spacecraft is oriented such that the solar arrays face the sun, which puts the sun at an angle of 90° from the boresight, so scattered light is not an issue.
The camera head is mounted to the spacecraft using an L-shaped bracket provided by the spacecraft vendor. The camera pointing can be precisely controlled by the use of adjustable shims at the bracket mounting interface. Typically, wide field-of-view systems do not have stringent alignment requirements, but because Junocam uses TDI, its CCD had to be precisely aligned to the spacecraft spin axis so that scene motion is exactly along CCD columns. A precision reflective alignment cube was provided on the optics to control alignment. Prior to instrument delivery, the angle between the cube normal and the CCD columns was measured by rotating the camera about its boresight while taking successive images of a long, straight target that had been leveled relative to gravity. Once the target was aligned with the CCD rows, the camera was rotated until a direct return from the alignment cube face to a laser level was observed. This angle was provided to the spacecraft vendor for use in alignment during mounting.
Post-mounting alignment verification was then performed by directly imaging a test target with Junocam. The position of each dot on the target was surveyed photogrammetrically in the spacecraft coordinate system, and the image of each dot on the camera focal plane determined by thresholding and centroiding the image. The dot locations in space were then mapped back to focal plane position using a camera model. Analysis of the dot positions shows that the spin axis is within 2 milliradians of the CCD column axis, well under the maximum requirement of 7.8 milliradians imposed by one column of crosstrack drift in 128 TDI steps.
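The 7.8 milliradian requirement follows directly from the one-column-in-128-steps criterion:

```python
import math

# A tilt of the CCD columns relative to the spin axis smears the scene
# crosstrack during TDI. One column of drift over 128 TDI steps corresponds to:
max_tilt_rad = math.atan(1.0 / 128.0)
print(f"max allowed tilt = {max_tilt_rad * 1e3:.2f} mrad")   # ~7.8 mrad

measured_rad = 2e-3   # measured post-mounting alignment from the text
assert measured_rad < max_tilt_rad   # 2 mrad is well within the budget
```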
4.10 TDI Polarity
4.11 Cruise Imaging
Junocam imaged the Earth and Moon again during the Earth flyby in October 2013. Results are discussed in Sect. 6.4.
5 Operations and Commanding
5.1 Image Acquisition
Eight parameters need to be set to acquire an image: the number of TDI stages, per-frame exposure time, compression level, companding, color bands to be acquired, number of frames to be taken, interframe times, and summing (0 for red, green and blue filters or 2× for methane). Although companding (going from 12 bits to 8 bits) can be specified, in general the default (square root) will be used.
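The eight parameters can be gathered into a single command record. This container is purely illustrative: the field names and defaults are assumptions, not the flight command format.

```python
from dataclasses import dataclass

@dataclass
class JunocamImageCommand:
    """Hypothetical holder for the eight image-command parameters in the text."""
    tdi_stages: int = 100
    exposure_ms: float = 3.2
    compression_level: int = 1
    companding: str = "sqrt"              # default square-root 12-to-8-bit encoding
    bands: tuple = ("red", "green", "blue")
    num_frames: int = 20
    interframe_ms: int = 0                # 0: derive from spin rate (Sect. 5.1)
    summing: int = 0                      # 0 for RGB filters, 2 for methane

# Methane images use 2x summing and are taken on a separate rotation.
methane = JunocamImageCommand(bands=("methane",), summing=2)
assert methane.summing == 2
```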
Because the methane filter requires 2× summing to reach an acceptable SNR, the methane image will be acquired in a separate rotation from the red-green-blue images. It is possible to take individual red, green and blue images if desired, but normally all 3 will be acquired in the same spacecraft rotation.
The interframe (time between frames) is in units of milliseconds. The maximum value is 65 s. When the interframe time is set to zero, the software computes an appropriate value based on the spin rate as provided by the spacecraft’s attitude control system. (In this case the commanded exposure time is ignored and the actual exposure time is computed based on the spin rate and the commanded TDI.)
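One way the software might compute the interframe from the spin rate is sketched below. The actual onboard algorithm is not described in the text; the IFOV and frame-to-frame overlap margin are assumptions.

```python
import math

FRAMELET_ROWS = 128   # framelet height (Sect. 3)
IFOV_RAD = 0.67e-3    # assumed instantaneous field of view per pixel
OVERLAP_ROWS = 8      # assumed frame-to-frame overlap margin

def interframe_ms(spin_rpm: float) -> float:
    """Interframe time so that consecutive framelets tile the scene with overlap."""
    omega = spin_rpm * 2.0 * math.pi / 60.0   # rad/s
    row_time_s = IFOV_RAD / omega             # scene moves one row per this time
    return (FRAMELET_ROWS - OVERLAP_ROWS) * row_time_s * 1e3

print(f"interframe at 2 rpm ~ {interframe_ms(2.0):.0f} ms")
```

At the nominal 2 rpm spin this gives a few hundred milliseconds between frames, consistent with dozens of frames per rotation.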
5.2 Orbital Timeline
At the beginning of the prime mission the spacecraft is almost directly over the north pole at ∼1 h before perijove and over the south pole at ∼1 h after perijove. Most imaging will be done in a two-hour period centered at perijove. For planning purposes, we split this period into 120 discrete imaging opportunities; with the nominal spacecraft spin rate of 2 rpm, a 4-color image can be acquired once per minute. Because of limited downlink data volume and other camera readout limitations, not all of these images can be taken, so the main purpose of the planning process is to select which are most desirable, balanced against the amount of compression.
5.3 Stereo Imaging
At its maximum velocity near periapsis, it takes about 7 min for the subspacecraft point to reach the visible limb. This means that images acquired less than 7 min apart will contain overlap regions seen at different viewing angles, allowing fore-aft stereo imaging. This may permit photogrammetric determination of the vertical structure of the cloudtops, though radiation transients near periapsis could complicate processing. A 4-color image can be obtained once per minute; at the minimum altitude of 4300 km, Juno’s speed is 57 km s−1, which means Junocam could image a point on the planet at emission angles of 0°, ±38°, ±67°, etc. At higher altitudes and slower velocities the number of possible emission angles is greater and the difference between them is less, so one might want to take images less frequently than once per minute.
5.4 Image Data Volume
Junocam image data is stored in dedicated framed partition space in the spacecraft computer. This limits the number of images that can be collected during the hours around perijove when the spacecraft is not returning telemetry in real-time. The Junocam partition is 1181 Mb.
The size of a Junocam image varies depending on the size of Jupiter in the field of view and the type of compression selected. An uncompressed 4-color image requires ∼75 Mb at ∼1 h from closest approach, the point at which Junocam’s resolution becomes better than Cassini’s. About 15 min later Jupiter fills the Junocam field of view and a 4-color uncompressed image with a direct view of the pole requires ∼100 Mb. At perijove a 4-color uncompressed image requires ∼120 Mb. Very roughly, that means ∼10 uncompressed images could be stored in a perijove pass. With compression the number will be substantially higher. We expect to achieve at least a factor of 2 compression, and likely much higher than that at the beginning of the prime mission when accumulated radiation damage is minimal.
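The buffer budget above is simple division over the quoted figures (all sizes in megabits):

```python
partition_mb = 1181                                        # Junocam partition (Sect. 5.4)
image_sizes_mb = {"1 h out": 75, "pole in view": 100, "perijove": 120}

# Worst case: every image is perijove-sized and uncompressed.
uncompressed_capacity = partition_mb // image_sizes_mb["perijove"]
print(f"~{uncompressed_capacity} uncompressed perijove-size images per pass")

compression_factor = 2   # conservative expectation from the text
print(f"~{uncompressed_capacity * compression_factor} images with 2x compression")
```

Since early-pass images are smaller (∼75 Mb), the true uncompressed capacity sits near the "very roughly ∼10" quoted in the text.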
6 Outreach Plans
In keeping with the outreach goals for Junocam, we intend to open up our operations so that the world can see how planetary science on a spacecraft is carried out. The general outreach theme for Junocam is “science in a fishbowl”. We will provide insight into the scientific planning process and the factors that influence scientific decisions. As the images are processed and analyzed, the link will be made between science planning and outcomes. The Junocam concept for operations relies on public involvement at three stages: advance planning, image selection, and image processing, as described in the following sections. Although some of the details may change, the inclusion of the public in the operation of Junocam drives our decision-making. With a very small professional operations team, and together with the professional community involved in active observations of Jupiter, we are relying on the public to fill in key pieces of Junocam operation. The public is an essential part of our virtual team.
6.1 Advance Planning and the Amateur Astronomy Community
Jupiter’s dynamic atmosphere presents an ever-changing pattern of reddish-brown belts and white zones. In order to anticipate what a given image of Jupiter will look like it is necessary to have current, not historic, images. We will rely on the amateur community, as well as the professional community such as those associated with the International Outer Planet Watch, to supply their most up-to-date ground-based pictures.
A web portal will be provided for images from the amateur astronomy community to be posted and organized. The location of some stable features in Jupiter’s dynamic atmosphere such as the Great Red Spot can be predicted well in advance. For others, predictions from the community will be made based on the available data in 2016. [Due to the solar conjunction in September 2016, ground-based observations of Jupiter will not be possible at the beginning of the Juno science mission and will likely not resume for several science orbits. In this timeframe we will rely on predictions from pre-conjunction ground-based images and Junocam images taken in Juno’s capture orbit.]
Based on amateur inputs and early images from Junocam, simulations of possible Junocam image footprints for each upcoming orbit will be posted on the web, for use in image selection.
6.2 Image Selection and Execution
Image candidates for each perijove pass will be selected considering:
Juno orbital location, resolution, and lighting geometry;
What features of interest are visible;
The degree of radiation effects, from both transient effects (which vary with spacecraft latitude) and accumulated damage.
Graduate students will participate in the image selection process. Based on simulations of the series of images for the two hours around periapsis, students will write blog entries promoting specific images, based on science rationales and other factors. A comment system, open to the public, will be provided to encourage a dialog about the merits of particular observations.
The public will be invited to help prioritize image candidates. The Junocam operations team will use the prioritized list to create camera commands. The public list will be merged with other requests from the Juno science team, and as many images as possible will be acquired until the onboard memory dedicated to Junocam images is exhausted. The web portal will provide access to the planning, dialogs, prioritized lists, and final image products, organized by science orbit.
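The selection step above amounts to walking a merged, prioritized request list and acquiring images until the Junocam partition is full. The sketch below illustrates that logic; the request names, sizes, and capacity split are made-up examples, not the actual command-generation process.

```python
# Illustrative sketch of the selection step: take requests already sorted by
# priority and keep adding images until the Junocam partition is exhausted.
PARTITION_MB = 1181

def select_images(requests, capacity_mb=PARTITION_MB):
    """requests: list of (name, size_mb) tuples, sorted highest priority first."""
    selected, used = [], 0.0
    for name, size_mb in requests:
        if used + size_mb <= capacity_mb:   # skip anything that no longer fits
            selected.append(name)
            used += size_mb
    return selected, used

# Hypothetical merged public + science-team list, highest priority first.
requests = [("GRS color", 60.0), ("north pole", 50.0), ("lightning search", 38.0)]
chosen, used = select_images(requests)
```

A real planning tool would also account for compression estimates and timing constraints around perijove, but the greedy fill captures the basic budget logic.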
6.3 Downlink and Data Processing
Initial image processing will be carried out at MSSS to construct the most basic image products. There are three types of products: (1) Experiment Data Records (EDRs) in decompressed image framelet order, 8 bits/pixel, as received from the spacecraft; (2) Reduced Data Records (RDRs) in decompanded flat-fielded form, framelet order, 16 bits/pixel; and (3) map-projected images. The data will then be made available via the web. Our goal is to have the images published within days of their receipt on the ground.
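The EDR→RDR step described above expands 8-bit samples back to a wider dynamic range and applies a flat field. The sketch below assumes a square-root companding curve and a unit flat field for illustration; the actual flight companding tables and calibration files are not reproduced here, and the framelet dimensions are used only as an example.

```python
import numpy as np

# Sketch of the EDR -> RDR step: expand 8-bit companded samples to a wider
# range via a lookup table, then divide out a flat field. The square-root
# companding curve and flat field here are illustrative assumptions.

def decompand_lut(bits_in=8, bits_out=12):
    """Inverse of a square-root companding curve: 8-bit code -> 12-bit DN."""
    codes = np.arange(2 ** bits_in, dtype=np.float64)
    max_in, max_out = 2 ** bits_in - 1, 2 ** bits_out - 1
    return np.round((codes / max_in) ** 2 * max_out).astype(np.uint16)

def edr_to_rdr(framelet_8bit, flat_field):
    """Decompand an 8-bit framelet and flat-field it into a 16-bit RDR framelet."""
    lut = decompand_lut()
    dn = lut[framelet_8bit].astype(np.float64)
    corrected = dn / flat_field              # flat field normalized to ~1.0
    return np.clip(corrected, 0, 65535).astype(np.uint16)

# One illustrative framelet with a flat (uniform) calibration field.
framelet = np.full((128, 1648), 128, dtype=np.uint8)
flat = np.ones_like(framelet, dtype=np.float64)
rdr = edr_to_rdr(framelet, flat)
```

The key point is the ordering: decompanding must precede flat-fielding, since the flat field is defined in linear DN space rather than in companded 8-bit codes.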
For additional products, we will rely on the amateur image-processing community. While basic processing will be done by image-processing professionals, we will primarily encourage the general public to be creative. Possible areas of effort include feature tracking, visualizations using other Juno instrument data and/or ground-based observations, methane mapping, and false color. We will encourage the public to provide figure captions and describe their processing techniques. This portion of our outreach plan was tested at the Earth flyby.
The web portal will be used to highlight these public contributions. When stereo is available we will encourage visualization products such as flyovers. The web portal will also link to publicly available papers and reports with scientific analysis of the content of the images.
6.4 The 2013 Earth Flyby
The Earth flyby served three goals for Junocam:
Acquire Junocam images of an extended object to validate expected camera performance and test image-reconstruction tools;
Test map-projection software on an extended object using the Junocam-specific camera model;
Provide data to the amateur image-processing community and encourage them to produce a variety of products.
To achieve the third goal, images were posted on the MSSS website within minutes of receipt. Different stages of processing were made available to the public: raw framelets, map-projected images, and preliminary color versions. A tutorial was provided.
The amateur community responded with many beautiful and creative products. A sample of these is highlighted on the NASA Juno website, www.nasa.gov/mission_pages/juno/multimedia/, with a more complete collection posted at missionjuno.swri.edu.
At MSSS, Junocam was designed and built by Jake Schaffner, Paul Otjens, Chris Martin, Hakeem Oluwo and Mike Malin. Operations support is provided by Robert Zimdar and Elsa Jensen. The Junocam optics were developed by Rockwell Collins Optronics, an effort supported by Chris Yarbrough, Charlie Micka, David Wallis and John Fitzpatrick. The team expresses its appreciation to the Juno spacecraft team at Lockheed-Martin Space Systems, especially Jeanne Ladewig, Jennifer Delavan, Valerie Rowland, and Chuck Rasbach. At the Juno Project at JPL our thanks go to Amy Snyder Hale, Mark Boyles, Phil Morton, Tim Koch, Randy Dodge, Michela Muñoz Fernández, Steven Watson and Steve Matousek. Junocam was developed for NASA under contract 1287931 with the Jet Propulsion Laboratory, California Institute of Technology.
Open Access This article is distributed under the terms of the Creative Commons Attribution License which permits any use, distribution, and reproduction in any medium, provided the original author(s) and the source are credited.