{"id":797574,"date":"2025-08-01T08:37:06","date_gmt":"2025-08-01T13:37:06","guid":{"rendered":"http:\/\/spaceweekly.com\/?p=797574"},"modified":"2025-08-01T08:37:06","modified_gmt":"2025-08-01T13:37:06","slug":"cameras-that-work-like-our-eyes-could-give-boost-to-astronomers","status":"publish","type":"post","link":"https:\/\/spaceweekly.com\/?p=797574","title":{"rendered":"Cameras that work like our eyes could give boost to astronomers"},"content":{"rendered":"<p> <br \/>\n<\/p>\n<div id=\"\">\n<figure class=\"ArticleImage\">\n<div class=\"Image__Wrapper\"><\/div><figcaption class=\"ArticleImageCaption\">\n<div class=\"ArticleImageCaption__CaptionWrapper\">\n<p class=\"ArticleImageCaption__Title\">The Sirius binary star system photographed with a neuromorphic camera<\/p>\n<p class=\"ArticleImageCaption__Credit\">Satyapreet Singh, Chetan Singh Thakur, Nirupam Roy, Indian Institute of Science<\/p>\n<\/div>\n<\/figcaption><\/figure>\n<\/p>\n<p>Cameras that mimic human eyesight could have key advantages for astronomers, allowing them to capture extremely bright and dim objects in the same image and track fast-moving objects without motion blur.<\/p>\n<p>Traditional digital cameras operate by sampling a grid of pixels many times a second, recording data from every pixel each time. Neuromorphic cameras, also known as event cameras, work very differently. Each pixel is only sampled if the brightness at that spot has changed; if a point on the sensor sees the same brightness as in the previous reading, then no new data is stored. This is similar to how sensory information is collected by the human eye.<\/p>\n<p><span class=\"js-content-prompt-opportunity\"\/><\/p>\n<p>This approach has several benefits: it stores less data for the same video because only changing pixels are recorded, and it can operate at much higher frame rates. On top of this, they can capture extremely dim objects even if they are next to very bright objects that would saturate frames taken on a traditional camera, because the pixels detect photons in a logarithmic scale rather than a linear one.<\/p>\n<p>To explore the potential of this technology for astronomy, Chetan Singh Thakur at the Indian Institute of Science, Bengaluru, and his colleagues installed a neuromorphic camera on a 1.3-metre-mirror telescope and a 20-centimetre-mirror telescope at the Aryabhatta Research Institute of Observational Sciences in Uttarakhand, India.<\/p>\n<p>They were able to clearly capture meteorites passing between Earth and the moon, as well as an image of the Sirius binary system, which consists of Sirius A \u2013 the brightest star in the night sky \u2013 and Sirius B.<\/p>\n<section>\n<\/section>\n<p>Sirius A is about 10,000 times brighter than Sirius B<strong>,<\/strong> which means they could never be captured clearly in a single image with traditional sensors, says Mark Norris at the University of Central Lancashire, UK, who wasn\u2019t involved in the work.<\/p>\n<p>Neuromorphic cameras are also extremely good at detecting fast-moving objects because of their higher frame rate, says Singh Thakur. \u201cYou can really go high speed, like a few kilohertz, and the advantage is if something is moving really fast, you\u2019ll be able to capture it. The normal camera would just give you motion blur.\u201d<\/p>\n<p>Telescopes often have multiple sensors that can be switched in and out as needed, says Norris. Neuromorphic cameras could be another tool in astronomers\u2019 arsenal for situations where you want to look at a very bright object and a very faint object at the same time, or for watching fast-moving objects like the recently discovered interstellar object 3I\/ATLAS, which is racing through our solar system.<\/p>\n<p>Tracking fast-moving objects usually requires either panning the telescope to follow it, which blurs the background and makes precise locations hard to calculate, or letting the object track across the telescope\u2019s field of view over time, which blurs the object itself. But a neuromorphic camera could accurately track the movement of an object at precise points and also retain the background to allow its location to be worked out.<\/p>\n<p>\u201cDo I want to know how bright it is accurately? Or do I want to know where it is?\u2029It\u2019s sort of like the quantum mechanical thing: you can\u2019t know both at the same time,\u201d says Norris. \u201cWell, this, potentially, is how we could know both at the same time.\u201d<\/p>\n<p>But while neuromorphic cameras offer some unique advantages, they aren\u2019t likely to be used for every application. Their resolution tends to be lower than charge-coupled devices (CCD), a type of sensor commonly used in digital cameras, and they capture photons with up to 78 per cent efficiency, compared to 95 per cent for CCDs. This means traditional sensors are more likely to capture an extremely dim object at the limits of detection.<\/p>\n<section class=\"ArticleTopics\" data-component-name=\"article-topics\">\n<p class=\"ArticleTopics__Heading\">Topics:<\/p>\n<\/section><\/div>\n<p><br \/>\n<br \/><a href=\"https:\/\/www.newscientist.com\/article\/2489766-cameras-that-work-like-our-eyes-could-give-boost-to-astronomers\/?utm_campaign=RSS%7CNSNS&#038;utm_source=NSNS&#038;utm_medium=RSS&#038;utm_content=space&#038;rand=772163\">Source link <\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>The Sirius binary star system photographed with a neuromorphic camera Satyapreet Singh, Chetan Singh Thakur, Nirupam Roy, Indian Institute of Science Cameras that mimic human eyesight could have key advantages&hellip; <\/p>\n","protected":false},"author":1,"featured_media":797575,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[39],"tags":[],"class_list":["post-797574","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-new-scientist"],"_links":{"self":[{"href":"https:\/\/spaceweekly.com\/index.php?rest_route=\/wp\/v2\/posts\/797574","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/spaceweekly.com\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/spaceweekly.com\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/spaceweekly.com\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/spaceweekly.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=797574"}],"version-history":[{"count":0,"href":"https:\/\/spaceweekly.com\/index.php?rest_route=\/wp\/v2\/posts\/797574\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/spaceweekly.com\/index.php?rest_route=\/wp\/v2\/media\/797575"}],"wp:attachment":[{"href":"https:\/\/spaceweekly.com\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=797574"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/spaceweekly.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=797574"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/spaceweekly.com\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=797574"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}