Robotic telescope
A robotic telescope is an astronomical telescope and detector system that makes observations without the intervention of a human. In astronomical disciplines, a telescope qualifies as robotic if it makes those observations without being operated by a human, even if a human has to initiate the observations at the beginning of the night, or end them in the morning. A robotic telescope is distinct from a remote telescope, though an instrument can be both robotic and remote.
Design
Robotic telescopes are complex systems that typically incorporate a number of subsystems. These subsystems include devices that provide telescope pointing capability, operation of the detector (typically a CCD camera), control of the dome or telescope enclosure, control over the telescope's focuser, detection of weather conditions, and other capabilities. Frequently these varying subsystems are presided over by a master control system, which is almost always a software component.
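To make the division of labor concrete, the following is a minimal sketch of how a master control system might coordinate such subsystems; all of the class and method names (Mount, Dome, Camera, Focuser, WeatherMonitor and their methods) are hypothetical placeholders rather than any particular observatory's real API.

```python
# Minimal sketch of a master control loop coordinating the subsystems named
# above. Every object and method here is a hypothetical placeholder.
class MasterControl:
    def __init__(self, mount, dome, camera, focuser, weather):
        self.mount, self.dome = mount, dome
        self.camera, self.focuser, self.weather = camera, focuser, weather

    def run_night(self, observation_queue):
        """Work through a queue of (target, exposure_seconds) requests."""
        self.dome.open()
        for target, exposure in observation_queue:
            if not self.weather.is_safe():
                break                        # close up rather than risk the optics
            self.mount.slew_to(target.ra, target.dec)
            self.focuser.autofocus()
            image = self.camera.expose(exposure)
            image.save(f"{target.name}.fits")
        self.dome.close()
        self.mount.park()
```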
Robotic telescopes operate under closed loop or open loop principles. In an open loop system, a robotic telescope system points itself and collects its data without inspecting the results of its operations to ensure it is operating properly. An open loop telescope is sometimes said to be operating on faith, in that if something goes wrong, there is no way for the control system to detect it and compensate.
A closed loop system has the capability to evaluate its operations through redundant inputs to detect errors. Typical examples are position encoders on the telescope's axes of motion and the evaluation of the system's images to confirm that the telescope was pointed at the correct field of view when they were exposed.
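As an illustration of the closed-loop idea, the sketch below checks an exposure's actual pointing (recovered from the image itself, for example by plate-solving against a star catalogue) against the commanded coordinates and re-centers if the error is too large; plate_solve() and the mount and camera objects are hypothetical placeholders.

```python
# Sketch of closed-loop pointing verification under the assumptions above.
POINTING_TOLERANCE_DEG = 0.05        # accept pointing errors below ~3 arcminutes

def verified_exposure(mount, camera, target_ra, target_dec, exposure_s):
    mount.slew_to(target_ra, target_dec)
    image = camera.expose(exposure_s)
    # Recover where the telescope actually pointed from the image itself.
    actual_ra, actual_dec = plate_solve(image)
    # Simple per-axis error check (ignoring the cos(dec) factor for brevity).
    err_ra = abs(actual_ra - target_ra)
    err_dec = abs(actual_dec - target_dec)
    if max(err_ra, err_dec) > POINTING_TOLERANCE_DEG:
        # Feed the measured error back into the mount and expose again.
        mount.offset(target_ra - actual_ra, target_dec - actual_dec)
        image = camera.expose(exposure_s)
    return image
```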
Most robotic telescopes are small telescopes. While large observatory instruments may be highly automated, few are operated without attendants.
History of professional robotic telescopes
Robotic telescopes were first developed by astronomers after electromechanical interfaces to computers became common at observatories. Early examples were expensive, had limited capabilities, and included a large number of unique subsystems, both in hardware and software. This contributed to a lack of progress in the development of robotic telescopes early in their history.
By the early 1980s, with the availability of cheap computers, several viable robotic telescope projects were conceived, and a few were developed. The 1985 book, Microcomputer Control of Telescopes, by Mark Trueblood and Russell M. Genet, was a landmark engineering study in the field. One of this book's achievements was pointing out many reasons, some quite subtle, why telescopes could not be reliably pointed using only basic astronomical calculations. The concepts explored in this book share a common heritage with the telescope mount error modeling software called Tpoint, which emerged from the first generation of large automated telescopes in the 1970s, notably the 3.9m Anglo-Australian Telescope.
Since the late 1980s, the University of Iowa has been at the forefront of robotic telescope development on the professional side. The Automated Telescope Facility (ATF), developed in the early 1990s, was located on the roof of the physics building at the University of Iowa in Iowa City. The university went on to complete the Iowa Robotic Observatory, a robotic and remote telescope at the private Winer Observatory, in 1997. This system successfully observed variable stars and contributed observations to dozens of scientific papers. In May 2002, they completed the Rigel Telescope. The Rigel was a 0.37-meter (14.5-inch) f/14 built by Optical Mechanics, Inc. and controlled by the Talon program.[1] Each of these was a progression toward a more automated and utilitarian observatory.
One of the largest current networks of robotic telescopes is RoboNet, operated by a consortium of UK universities. The Lincoln Near-Earth Asteroid Research (LINEAR) Project is another example of a professional robotic telescope. LINEAR's competitors, the Lowell Observatory Near-Earth-Object Search, Catalina Sky Survey, Spacewatch, and others, have also developed varying levels of automation.
In 2002, the RAPid Telescopes for Optical Response (RAPTOR) project pushed the envelope of automated robotic astronomy by becoming the first fully autonomous closed-loop robotic telescope. RAPTOR was designed in 2000 and began full deployment in 2002. The project was headed by Tom Vestrand and his team: James Wren, Robert White, P. Wozniak, and Heath Davis. First light on one of the wide-field instruments came in late 2001, and the second wide-field system came online in late 2002. Closed-loop operations began in 2003. Originally the goal of RAPTOR was to develop a system of ground-based telescopes that would reliably respond to satellite triggers and, more importantly, identify transients in real time and generate alerts with source locations to enable follow-up observations with other, larger telescopes. It has achieved both of these goals. RAPTOR has since been re-tuned to be the key hardware element of the Thinking Telescopes Technologies Project. Its new mandate is to monitor the night sky for interesting and anomalous behavior in persistent sources, using some of the most advanced robotic software ever deployed.

The two wide-field systems are each a mosaic of CCD cameras. Each mosaic covers an area of approximately 1500 square degrees to a depth of 12th magnitude. Centered in each wide-field array is a single fovea system with a field of view of 4 degrees and a depth of 16th magnitude. The wide-field systems are separated by a 38 km baseline. Supporting these wide-field systems are two other operational telescopes. The first is a cataloging patrol instrument with a mosaic 16-square-degree field of view down to 16th magnitude. The other is a 0.4 m optical tube assembly yielding a depth of 19th to 20th magnitude over a 0.35-degree field. Three additional systems are undergoing development and testing, with deployment to be staged over the following two years. All of the systems are mounted on custom-manufactured, fast-slewing mounts capable of reaching any point in the sky in 3 seconds. The RAPTOR system is located on site at Los Alamos National Laboratory (USA) and has been supported through the laboratory's Directed Research and Development funds.
In 2004, some professional robotic telescopes were characterized by a lack of design creativity and a reliance on closed-source, proprietary software. This software is usually unique to the telescope it was designed for and cannot be used on any other system. Robotic telescope software developed at universities often becomes impossible to maintain and ultimately obsolete because the graduate students who wrote it move on to new positions, and their institutions lose their knowledge. Large telescope consortia and government-funded laboratories tend not to suffer the same loss of developers as universities. Professional systems generally feature very high observing efficiency and reliability. There is also an increasing tendency to adopt ASCOM technology at a few professional facilities (see following section). The need for proprietary software is usually driven by the competition for research dollars between institutions.
History of amateur robotic telescopes
As of 2004, most robotic telescopes were in the hands of amateur astronomers. A prerequisite for the explosion of amateur robotic telescopes was the availability of relatively inexpensive CCD cameras, which appeared on the commercial market in the early 1990s. These cameras not only allowed amateur astronomers to make pleasing images of the night sky, but also encouraged more sophisticated amateurs to pursue research projects in cooperation with professional astronomers. The main motive behind the development of amateur robotic telescopes has been the tedium of making research-oriented astronomical observations, such as taking endlessly repetitive images of a variable star.
In 1998, Bob Denny conceived of a software interface standard for astronomical equipment, based on Microsoft's Component Object Model, which he called the Astronomy Common Object Model (ASCOM). He wrote and published the first examples of this standard, in the form of commercial telescope control and image analysis programs and several freeware components, and convinced Doug George to incorporate ASCOM capability into a commercial camera control program. Through this technology, a master control system that integrated these applications could easily be written in Perl, VBScript, or JavaScript; a sample script of that nature was provided by Denny.
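As a rough illustration of the kind of glue script ASCOM enables, the sketch below drives a telescope through the standard ASCOM interface, written here in Python via the pywin32 COM bridge rather than the Perl, VBScript, or JavaScript mentioned above. It assumes the ASCOM telescope simulator driver (ProgID "ASCOM.Simulator.Telescope") is installed; it is not Denny's original sample script.

```python
import win32com.client

# Connect to a driver through its COM ProgID; the ASCOM telescope simulator
# is assumed here, but any conforming driver is addressed the same way.
telescope = win32com.client.Dispatch("ASCOM.Simulator.Telescope")
telescope.Connected = True                  # standard ASCOM property
telescope.Tracking = True

# Slew with the standard interface method: RA in hours, Dec in degrees (example values).
telescope.SlewToCoordinates(5.9, 7.4)

# ...here the script could trigger a camera control program through its own COM objects...

telescope.Park()
telescope.Connected = False
```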
Following coverage of ASCOM in Sky & Telescope magazine several months later, ASCOM architects such as Bob Denny, Doug George, and Tim Long guided ASCOM into becoming a set of codified interface standards for freeware device drivers for telescopes, CCD cameras, focusers, and astronomical observatory domes. As a result, amateur robotic telescopes have become increasingly sophisticated and reliable while software costs have plunged. ASCOM has also been adopted for some professional robotic telescopes.
Meanwhile, ASCOM users designed ever more capable master control systems. Papers presented at the Minor Planet Amateur-Professional Workshops (MPAPW) in 1999, 2000, and 2001 and the International Amateur-Professional Photoelectric Photometry Conferences of 1998, 1999, 2000, 2001, 2002, and 2003 documented increasingly sophisticated master control systems. Some of the capabilities of these systems included automatic selection of observing targets, the ability to interrupt observing or rearrange observing schedules for targets of opportunity, automatic selection of guide stars, and sophisticated error detection and correction algorithms.
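The sketch below illustrates two of the capabilities just listed, automatic target selection and pre-emption by targets of opportunity, using a simple priority queue; the priority scheme and all names are invented for the example and are not taken from any of the systems described in those papers.

```python
import heapq
import itertools

class Scheduler:
    """Toy priority-driven target queue (smaller number = higher priority)."""

    def __init__(self):
        self._queue = []
        self._counter = itertools.count()   # tie-breaker so targets are never compared

    def add(self, target, priority):
        heapq.heappush(self._queue, (priority, next(self._counter), target))

    def add_target_of_opportunity(self, target):
        # Priority 0 sorts ahead of the regular plan, so the next time the
        # control loop asks for work this target pre-empts everything else.
        heapq.heappush(self._queue, (0, next(self._counter), target))

    def next_target(self, is_observable):
        # Skip targets that have set, are too close to the Moon, and so on.
        while self._queue:
            _, _, target = heapq.heappop(self._queue)
            if is_observable(target):
                return target
        return None
```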
Development of the Remote Telescope System 2 (RTS2) started in 1999, with the first test runs on real telescope hardware in early 2000. RTS2 was primarily intended for gamma-ray burst follow-up observations, so the ability to interrupt an observation was a core part of its design. During development it grew into an integrated observatory management suite. Other additions included the use of a PostgreSQL database for storing targets and observation logs, the ability to perform image processing (including astrometry) and real-time telescope corrections, and a web-based user interface. RTS2 was designed from the beginning as a completely open-source system, without any proprietary components. To support a growing list of mounts, sensors, CCDs, and roof systems, it uses its own text-based communication protocol. The RTS2 system is described in papers appearing in 2004 and 2006.[2]
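As a purely illustrative example of the targets-in-a-database idea, the sketch below creates and populates a small PostgreSQL table from Python using the psycopg2 driver; the database name, table layout, and column names are invented for the example and are not RTS2's actual schema.

```python
import psycopg2

conn = psycopg2.connect("dbname=observatory")        # assumed local database
with conn, conn.cursor() as cur:
    # An illustrative target list: coordinates plus a scheduling priority.
    cur.execute("""
        CREATE TABLE IF NOT EXISTS targets (
            id       SERIAL PRIMARY KEY,
            name     TEXT NOT NULL,
            ra_deg   DOUBLE PRECISION NOT NULL,
            dec_deg  DOUBLE PRECISION NOT NULL,
            priority INTEGER DEFAULT 100
        )""")
    cur.execute(
        "INSERT INTO targets (name, ra_deg, dec_deg, priority) VALUES (%s, %s, %s, %s)",
        ("GRB follow-up field", 123.45, -20.5, 1))
```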
The Instrument Neutral Distributed Interface (INDI) was started in 2003. In comparison to the Microsoft Windows-centric ASCOM standard, INDI is a platform-independent protocol developed by Elwood C. Downey of ClearSky Institute to support control, automation, data acquisition, and exchange among hardware devices and software front ends.
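INDI clients and servers exchange small XML messages over TCP (port 7624 by default). The minimal sketch below, using only the Python standard library, asks a local INDI server to enumerate its devices; the host name, fixed-size read, and lack of a proper XML parser are simplifications for the example.

```python
import socket

with socket.create_connection(("localhost", 7624), timeout=5) as sock:
    # Ask the server to describe every device and property it offers.
    sock.sendall(b'<getProperties version="1.7" />\n')
    reply = sock.recv(65536)         # XML property definitions stream back
    print(reply.decode(errors="replace"))
```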
Significance
By 2004, robotic observations accounted for an overwhelming percentage of the published scientific information on asteroid orbits and discoveries, variable star studies, supernova light curves and discoveries, comet orbits and gravitational microlensing observations.
All early-phase gamma-ray burst observations were carried out by robotic telescopes.
List of robotic telescopes
Notable professional robotic telescopes include:
- TRAPPIST, 60 cm, La Silla, Chile.
- Super-LOTIS, 60 cm, Steward Observatory on Kitt Peak, Arizona, USA.
- Liverpool Telescope (robotic telescope), 2.0 m, on La Palma, Canary Islands
- Automated Planet Finder, 2.4 m, Lick Observatory on Mount Hamilton, California, USA.
- Slooh telescopes, various sizes & locations.
See also
References
External links
- List of professional robotic telescopes (with map and statistics).
- "Robotic telescopes: An interactive exhibit on the world-wide web". CiteSeerX: 10
.1 : provides an overview of telescope operation through the internet.1 .51 .9564