Prof. Dr. Ralf Reulke
Profile
Research topics (57)
3D Body Analytics
Funder: Federal Ministry for Economic Affairs and Energy · Period: 09/2018–08/2019 · Project lead: Prof. Dr. Ralf Reulke
Requirements for geometric fusion methods
Period: 08/2006–11/2006 · Project lead: Prof. Dr. Ralf Reulke
Construction and investigation of a test setup for quantum-efficiency measurements in the UV range with various optical components (radiation sources, spectral filters, etc.)
Period: 07/2017–12/2017 · Project lead: Prof. Dr. Ralf Reulke
Evaluation and visualisation of HDAC data
Period: 01/2006–09/2006 · Project lead: Prof. Dr. Ralf Reulke
Automated vehicle detection and traffic counting
Funder: Commercial enterprise / private sector · Period: 07/2010–10/2010 · Project lead: Prof. Dr. Ralf Reulke
Automated vehicle detection and traffic counting 1
Period: 11/2010–11/2010 · Project lead: Prof. Dr. Ralf Reulke
Automated vehicle detection and traffic counting 2
Period: 10/2010–10/2010 · Project lead: Prof. Dr. Ralf Reulke
Contributions to detection, localization, recognition, identification and tracking of 3D (man-made, industry-oriented) objects in real-world scenes
Period: 02/2015–09/2015 · Project lead: Prof. Dr. Ralf Reulke
Evaluation of the 72-hour video stream on the distribution of passengers across 4 car bodies and synchronisation with the circulation and counting data from the AFZ system, with support from Interautomation
Funder: Commercial enterprise / private sector · Period: 12/2015–01/2016 · Project lead: Prof. Dr. Ralf Reulke
Canonic 3D/CAD Annotation
Period: 06/2015–09/2015 · Project lead: Prof. Dr. Ralf Reulke
Data analyses for Firebird
Period: 10/2016–12/2016 · Project lead: Prof. Dr. Ralf Reulke
DELIOS
Funder: State of Berlin – Other · Period: 07/2005–12/2007 · Project lead: Prof. Dr. Ralf Reulke
Design & implementation of a neural network for person detection (Transferbonus)
Period: 11/2017–02/2018 · Project lead: Prof. Dr. Ralf Reulke
Development, provision and testing of OGSE in the CHEOPS project
Period: 07/2015–01/2016 · Project lead: Prof. Dr. Ralf Reulke
Development of an autonomously operating system solution for the early detection of stress and pain states in horses, based on a multifactorial analysis of vital parameters and behavioural patterns
Funder: Federal Ministry for Economic Affairs and Energy · Period: 06/2017–05/2020 · Project lead: Prof. Dr. Ralf Reulke
Development of a system solution for online safety monitoring in rail-bound local passenger transport, based on the automated analysis and assessment of 3D video data
Funder: Federal Ministry for Economic Affairs and Energy · Period: 09/2011–05/2014 · Project lead: Prof. Dr. Ralf Reulke
Development of an automated measurement station for the calibration of camera systems; development of methods for camera-system calibration
Funder: Federal Ministry for Economic Affairs and Energy · Period: 07/2011–12/2013 · Project lead: Prof. Dr. Ralf Reulke
Development of a technical requirements catalogue for the implementation of camera-based systems in regional rail (SPNV) vehicles
Funder: Commercial enterprise / private sector · Period: 09/2013–02/2014 · Project lead: Prof. Dr. Ralf Reulke
Development and provision of test and processing capabilities for Grace (EGSE & OGSE)
Period: 05/2014–10/2014 · Project lead: Prof. Dr. Ralf Reulke
Development and provision of test and processing capabilities for Grace (EGSE & OGSE)
Funder: Commercial enterprise / private sector · Period: 05/2014–10/2014 · Project lead: Prof. Dr. Ralf Reulke
Development and provision of test and processing capabilities for Sentinel-4 (EGSE & OGSE)
Period: 07/2015–01/2016 · Project lead: Prof. Dr. Ralf Reulke
Development and testing for “test modules / attitude-data acquisition”
Funder: Commercial enterprise / private sector · Period: 06/2015–09/2015 · Project lead: Prof. Dr. Ralf Reulke
Extension of the radiation source and collimator
Funder: Commercial enterprise / private sector · Period: 07/2015–08/2015 · Project lead: Prof. Dr. Ralf Reulke
EXIST start-up grant: VINS
Funder: Federal Ministry for Economic Affairs and Energy · Period: 03/2017–02/2018 · Project lead: Prof. Dr. Ralf Reulke
Geometric fusion methods
Period: 06/2009–12/2009 · Project lead: Prof. Dr. Ralf Reulke
Geometric fusion methods
Period: 05/2007–12/2007 · Project lead: Prof. Dr. Ralf Reulke
Implementation of image-processing methods
Period: 06/2006–12/2007 · Project lead: Prof. Dr. Ralf Reulke
Implementation of OpenCV in Cassandra
Funder: Commercial enterprise / private sector · Period: 10/2012–12/2012 · Project lead: Prof. Dr. Ralf Reulke
Integration of a seat-occupancy-detection solution into a Siemens system and development of a fusion solution
Period: 06/2015–07/2015 · Project lead: Prof. Dr. Ralf Reulke
International Workshop on Combinatorial Image Analysis 2006 (event: 19–21 June 2006, Berlin)
Period: 03/2006–12/2007 · Project lead: Prof. Dr. Ralf Reulke
IQ wireless
Period: 01/2015–06/2015 · Project lead: Prof. Dr. Ralf Reulke
Camera calibration and development of a Cassandra module
Funder: Commercial enterprise / private sector · Period: 05/2011–05/2011 · Project lead: Prof. Dr. Ralf Reulke
Calibration and test tasks
Funder: Commercial enterprise / private sector · Period: 07/2015–09/2015 · Project lead: Prof. Dr. Ralf Reulke
Additional work in the “Grace” project
Period: 02/2016–03/2016 · Project lead: Prof. Dr. Ralf Reulke
MELA-Screen: development of an optical system for 3D whole-body examination to implement routine-ready, automated screening for early skin-cancer detection
Funder: Federal Ministry for Economic Affairs and Energy · Period: 09/2018–10/2021 · Project lead: Prof. Dr. Ralf Reulke
Multi-sensor system
Funder: Federal Ministry for Economic Affairs and Energy · Period: 07/2009–03/2011 · Project lead: Prof. Dr. Ralf Reulke
Multi-parameter system for pain monitoring in newborns; development of 3D models and tracking algorithms for motion analysis
Funder: Federal Ministry for Economic Affairs and Energy · Period: 09/2014–03/2017 · Project lead: Prof. Dr. Ralf Reulke
Object detection and classification at the Institut für Optische Sensorsysteme
Period: 01/2015–06/2015 · Project lead: Prof. Dr. Ralf Reulke
Object and change detection from optical and radar data
Period: 03/2013–12/2014 · Project lead: Prof. Dr. Ralf Reulke
Orientation determination with smartphone sensors
Funder: Commercial enterprise / private sector · Period: 05/2010–06/2010 · Project lead: Prof. Dr. Ralf Reulke
Panoramic Photogrammetry Workshop (event: 24–25 February 2005, Berlin)
Period: 02/2005–12/2006 · Project lead: Prof. Dr. Ralf Reulke
PAN sharpening
Period: 10/2004–12/2005 · Project lead: Prof. Dr. Ralf Reulke
Performance analysis
Funder: Commercial enterprise / private sector · Period: 08/2009–02/2010 · Project lead: Prof. Dr. Ralf Reulke
Performance measurement for the Sentinel-4 FPA
Period: 06/2016–12/2016 · Project lead: Prof. Dr. Ralf Reulke
ProAnimalLife - RightRiding / development of the hardware concept and of algorithms for image-based evaluation of riding behaviour
Funder: BMWE: ZIM · Period: 10/2019–12/2024 · Project lead: Prof. Dr. Ralf Reulke
ProAnimalLife - VitaCheck / optical monitoring system for analysing movement behaviour and skin texture
Funder: Federal Ministry for Economic Affairs and Energy · Period: 05/2018–10/2020 · Project lead: Prof. Dr. Ralf Reulke
Radiometric calibration
Period: 11/2005–10/2006 · Project lead: Prof. Dr. Ralf Reulke
Fast photoluminescence measurement method for the inspection of solar modules using deep-depletion CCD cameras
Funder: State of Berlin – Other · Period: 01/2009–06/2011 · Project lead: Prof. Dr. Ralf Reulke
Seat Occupancy Detection
Funder: Commercial enterprise / private sector · Period: 04/2015–05/2015 · Project lead: Prof. Dr. Ralf Reulke
Segmentation of football footage: crowd, perimeter advertising, pitch (masking + features in the crowd or on players); Transferbonus
Funder: Commercial enterprise / private sector · Period: 11/2015–03/2016 · Project lead: Prof. Dr. Ralf Reulke
Software development for improved use of camera systems
409-02-A · Softwaretechnik · Period: 09/2006–04/2007 · Project lead: Prof. Dr. Ralf Reulke
Radiation source for the radiometric and spectral calibration of a hyperspectral camera
Funder: Commercial enterprise / private sector · Period: 10/2014–01/2015 · Project lead: Prof. Dr. Ralf Reulke
Investigation of the reference system for evaluating the DELIOS system
Period: 11/2007–12/2007 · Project lead: Prof. Dr. Ralf Reulke
Validation of algorithms for stereo matching
Funder: Federal Ministry for Research, Technology and Space · Period: 05/2010–10/2010 · Project lead: Prof. Dr. Ralf Reulke
Improving the attitude determination of camera platforms / Transferbonus
Funder: Commercial enterprise / private sector · Period: 02/2015–05/2015 · Project lead: Prof. Dr. Ralf Reulke
Verification of a passenger counter
Funder: Commercial enterprise / private sector · Period: 12/2014–03/2015 · Project lead: Prof. Dr. Ralf Reulke
Workshop: Low-Cost 3D – Sensors, Algorithms, Applications (6–7 December 2011, Berlin)
Period: 10/2011–12/2011 · Project lead: Prof. Dr. Ralf Reulke
Possible industry partners (10)
As of: 26.4.2026, 19:48:44 (Top-K=20, Min-Cosine=0.4)
- 63 · 85.0% · Design & implementation of a neural network for person detection (Transferbonus)
  NVIDIA GmbH
- 90 · 61.7% · EU: Simulation in Multiscale Physical and Biological Systems (STIMULATE)
- 92 · 61.7% · EU: Simulation in Multiscale Physical and Biological Systems (STIMULATE)
- 134 · 61.7% · EU: Simulation in Multiscale Physical and Biological Systems (STIMULATE); 55.1% · EU: Bottom-Up Generation of atomicalLy Precise syntheTIc 2D MATerials for High Performance in Energy and Electronic Applications – A Multi-Site Innovative Training Action (ULTIMATE)
- 29 · 61.4% · Grant under the programme “exist – business start-ups from science” from the federal budget (section 09, chapter 02, title 68607, fiscal year 2026) and from European Structural Funds (here: European Social Fund Plus – ESF Plus), funding period 2021–2027, as co-financing for the project “exist Women”
- 68 · 60.4% · DFG research grant: Attention and sensory integration in active vision of moving objects; 55.1% · SFB 1315/2: Mechanisms and disorders of memory consolidation: from synapses to the systems level
- 91 · 60.1% · Interfaces in opto-electronic thin film multilayer devices
- 135 · 60.0% · EU: Context Sensitive Multisensory Object Recognition (HBP)
- 114 · 60.0% · EU: Context Sensitive Multisensory Object Recognition (HBP)
- 126 · 60.0% · EU: Context Sensitive Multisensory Object Recognition (HBP)
Publications (25)
Top 25 by citations · Source: OpenAlex (BAAI/bge-m3 embeddings used for matching).
elib (German Aerospace Center) · 591 citations
Space Science Reviews · 252 citations · DOI
Science · 125 citations · DOI
Neutral oxygen in the saturnian system shows variability, and the total number of oxygen atoms peaks at 4 × 10^34. Saturn's aurora brightens in response to solar-wind forcing, and the auroral spectrum resembles Jupiter's. Phoebe's surface shows variable water-ice content, and the data indicate it originated in the outer solar system. Saturn's rings also show variable water abundance, with the purest ice in the outermost A ring. This radial variation is consistent with initially pure water ice bombarded by meteors, but smaller radial structures may indicate collisional transport and recent renewal events in the past 10^7 to 10^8 years.
Sensors · 113 citations · DOI
Whether for the identification and characterisation of materials or for monitoring of the environment, space-based hyperspectral instruments are very useful. Hyperspectral instruments measure several dozen up to hundreds of spectral bands. These data help to reconstruct spectral properties such as the reflectance or emission of the Earth's surface or the absorption of the atmosphere, and to identify constituents on land, in water, and in the atmosphere. There are many possible applications, from vegetation and water quality up to greenhouse-gas monitoring. But the actual number of space-based hyperspectral missions, and thus of space-based hyperspectral data, is still limited. This will change in the coming years through different missions. The German Aerospace Center (DLR) Earth Sensing Imaging Spectrometer (DESIS) is one of the currently existing space-based hyperspectral instruments, launched in 2018 and ready to reduce the gap in space-borne hyperspectral data. The instrument operates onboard the International Space Station, using the Multi-User System for Earth Sensing (MUSES) platform. The instrument has 235 spectral bands in the wavelength range from the visible (400 nm) to the near-infrared (1000 nm), which results in a 2.5 nm spectral sampling distance, and a ground sampling distance of 30 m from the 400 km orbit of the International Space Station. In this article, the design of the instrument is described.
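As a quick plausibility check of the figures quoted above (assuming the 235 bands are spread evenly across the stated 400–1000 nm range):

```python
# 235 bands over a 600 nm span imply roughly the stated 2.5 nm sampling distance
span_nm = 1000 - 400
bands = 235
sampling_nm = span_nm / bands
print(round(sampling_nm, 2))  # 2.55, consistent with the quoted ~2.5 nm
```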
ISPRS Journal of Photogrammetry and Remote Sensing · 111 citations · DOI
elib (German Aerospace Center) · 82 citations
Digital imagery from satellites or from multispectral and hyperspectral scanners is well accepted. Tremendous challenges are inherent in the development of a digital sensor that acquires imagery suitable for both high-precision photogrammetric mapping and image processing for interpretative purposes. The performance of the film aerial camera is almost impossible to reach with current digital technology. Joint development work by LH Systems and the Deutsches Zentrum für Luft- und Raumfahrt (German Aerospace Center, DLR) has led to considerable success using forward-, nadir- and backward-looking linear arrays on the focal plane to provide panchromatic imagery and geometric information, supplemented by further arrays for multispectral data. This has culminated in the ADS40 product. In the paper, all essential components of the ADS40 are addressed: optics, filters, CCD type and configuration, front-end electronics, computer, flight-management and sensor-control software, mass memory, and the attitude- and position-measurement system. The imagery from the new sensor will fulfil many market requirements between the highest-resolution film imagery (<0.15 m) and high-resolution space imagery (1–10 m). The sensor's unique blend of multispectral information with high-quality geometric information will give rise to numerous new applications.
elib (German Aerospace Center) · 50 citations
The management and planning of forests presumes the availability of up-to-date information on their current state. The relevant parameters, such as tree species and the diameter of the bole at defined heights and positions, are usually represented by a forest inventory. In order to allow the collection of these inventory parameters, an approach aiming at the integration of a terrestrial laser scanner and a high-resolution panoramic camera has been developed. The integration of these sensors provides geometric information from distance measurement and high-resolution radiometric information from the panoramic images. In order to enable a combined evaluation, a co-registration of both data sets is required as a first processing step. Afterwards, geometric quantities such as the position and diameter of trees can be derived from the LIDAR data, whereas texture parameters derived from the high-resolution panoramic imagery can be applied for tree-species recognition.
ISPRS Journal of Photogrammetry and Remote Sensing · 46 citations · DOI
elib (German Aerospace Center) · 40 citations
Typical devices for acquiring distance data are laser scanners. Combining these with higher-resolution image data is state of the art. Recently, camera systems have become available that provide distance and image data without the use of mechanical parts. Two manufacturers of such camera systems, with a typical resolution of 160 x 120 pixels, are CSEM (Swiss Ranger) and PMDTechnologies GmbH. This paper describes a design combining a PMD and a higher-resolution RGB camera. For error-free operation, calibration of both cameras and determination of the alignment between both systems are necessary. This also makes a performance determination of the PMD system possible.
Pattern Recognition Letters · 38 citations · DOI
Lecture Notes in Computer Science · 32 citations · DOI
29 citations · DOI
physica status solidi (a) · 26 citations · DOI
Using a Gaussian approximation for the depth distribution of the energy dissipation function under electron bombardment, the electron-beam-induced barrier current (BC) is determined as a function of the position on a surface perpendicular to the barrier. Comparison of theory and experiment for the energy and position dependence of the BC provided values for the diffusion length and the surface recombination velocity of excess carriers. The experiments were performed over a wide range of diffusion lengths and dopant concentrations on Si and GaP for primary electron energies between 10 and 30 keV. The parameters of the Gaussian depth distribution are estimated for Si and GaP and compared with values taken from the literature.
elib (German Aerospace Center) · 24 citations
Airborne linear array sensors present new challenges for photogrammetric software. The push-broom nature of these sensor systems has the potential for very high quality images, but these are heavily influenced by the dynamics of the aircraft during acquisition. Fortunately, highly precise position and attitude measurements have become possible using today's inertial measurement units (IMUs). This allows image restoration to the sub-pixel level. The sensor discussed in detail here is a three-line camera with additional multispectral lines: one line looking forward, one in the nadir position and one looking backward with respect to the flight path. Extensive software processes are necessary to produce traditional photogrammetric products from a push-broom airborne sensor. The first steps of the ground processing flow are off-loading imagery and supporting data from the mass-memory system of the sensor, post-processing of GPS/IMU data and image rectification into stereo-viewable and matchable form. After this processing, the images can be used similarly to classical aerial photography. This includes semi-automated triangulation with and without ground control, DTM production from multiple stereo views, vector extraction in mono and stereo, and orthophoto and mosaic production. The paper analyses the differences from classical photogrammetric processing for all processing steps and closes with a discussion of the advantages and disadvantages of this new type of photogrammetric imagery.
elib (German Aerospace Center) · 20 citations
Driven by the rising interest in traffic surveillance for simulations and decision management, many publications concentrate on automatic vehicle detection or tracking. Quantities and velocities of different car classes form the data basis for almost every traffic model. Especially during mass events or disasters, wide-area traffic monitoring on demand is needed, which can only be provided by airborne systems. This means a massive amount of image information to be handled. In this paper we present a combination of vehicle detection and tracking which is adapted to the special restrictions on image size and flow but nevertheless yields reliable information about the traffic situation. Combining a set of modified edge filters, it is possible to detect cars of different sizes and orientations with minimum computing effort if some a priori information about the street network is used. The found vehicles are tracked between two consecutive images by an algorithm using singular value decomposition. Based on their distance and correlation, the features are assigned pairwise with respect to their global positioning among each other. Choosing only the best-correlating assignments, it is possible to compute reliable values for the average velocities.
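The pairwise assignment via singular value decomposition mentioned in the abstract is in the spirit of the classic Scott & Longuet-Higgins correspondence scheme; the sketch below is a minimal illustration under that assumption, not the authors' implementation (the helper `svd_match` and its parameters are hypothetical):

```python
import numpy as np

def svd_match(pts_a, pts_b, sigma=10.0):
    # Build a Gaussian proximity matrix between all cross-frame feature pairs,
    # then replace its singular values by ones; dominant entries of the result
    # mark mutually exclusive pairings.
    d2 = ((pts_a[:, None, :] - pts_b[None, :, :]) ** 2).sum(-1)
    g = np.exp(-d2 / (2.0 * sigma**2))
    u, _, vt = np.linalg.svd(g, full_matrices=False)
    p = u @ vt
    # Accept a pair only if it dominates both its row and its column
    return [(i, int(np.argmax(p[i])))
            for i in range(p.shape[0])
            if i == int(np.argmax(p[:, np.argmax(p[i])]))]

a = np.array([[0.0, 0.0], [50.0, 0.0], [0.0, 50.0]])  # features in frame t
b = a + 2.0                                           # same features, shifted, frame t+1
matches = svd_match(a, b)
print(matches)  # [(0, 0), (1, 1), (2, 2)]
```

With well-separated features and a small inter-frame shift, each feature is matched to its displaced counterpart.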
18 citations · DOI
Improvement of Spatial Resolution with Staggered Arrays as Used in the Airborne Optical Sensor ADS40
2004 · elib (German Aerospace Center) · 17 citations
Using pushbroom sensors onboard aircraft or satellites requires, especially for photogrammetric applications, wide image swaths with high geometric resolution. One approach to satisfy both demands is to use staggered line arrays, which are constructed from two identical CCD lines shifted against each other by half a pixel in the line direction. Practical applications of such arrays in remote sensing include SPOT and, in the commercial environment, the Airborne Digital Sensor, or ADS40, from Leica Geosystems. Theoretically, the usefulness of staggered arrays depends on the spatial resolution, which is defined by the total point spread function of the imaging system and Shannon's sampling theorem. Due to the two shifted sensor lines, staggering results in twice the number of sampling points perpendicular to the flight direction. In order to simultaneously double the sample number in the flight direction, the line readout rate, or integration time, has to produce half a pixel spacing on the ground. Staggering in combination with a high-resolution optical system can be used to fulfil the sampling condition, which means that no spectral components above the critical spatial frequency 2/D are present. Theoretically, the resolution is as good as for a non-staggered line with half the pixel size D/2, but the radiometric dynamics should be twice as high. In practice, the slightly different viewing angle of the two lines of a staggered array can result in a deterioration of image quality due to aircraft motion, attitude fluctuations or terrain undulation. Fulfilling the sampling condition further means that no aliasing occurs. This is essential for the image quality in quasi-periodically textured image areas and for photogrammetric sub-pixel accuracy. Furthermore, image restoration methods for enhancing the image quality can be applied more efficiently. The panchromatic resolution of the ADS40 optics is optimised for image collection by a staggered array. This means it transfers spatial frequencies up to twice the Nyquist frequency of its 12k sensors. First experiments, carried out some years ago, already indicated a spatial-resolution improvement from applying image restitution to the ADS40 staggered 12k pairs. The results of the restitution algorithm, which is integrated in the ADS image-processing flow, have now been analysed quantitatively. This paper presents the theory of high-resolution image restitution from staggered lines and practical results with ADS40 high-resolution panchromatic images and high-resolution colour images, created by sharpening 12k colour images with high-resolution panchromatic ones.
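The half-pixel staggering idea can be illustrated numerically; this is a generic sketch of the sampling geometry, not ADS40 processing code:

```python
import numpy as np

# Two identical lines with pixel pitch D, shifted against each other by D/2,
# together sample the scene at half the pitch (double the sampling rate).
D = 1.0
line_1 = np.arange(16) * D          # sample centres of the first CCD line
line_2 = line_1 + D / 2             # second line, staggered by half a pixel
combined = np.sort(np.concatenate([line_1, line_2]))
print(np.unique(np.diff(combined)))  # [0.5] -> effective sampling distance D/2
```

Halving the sampling distance doubles the Nyquist frequency, which is why staggering relaxes the sampling condition discussed above.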
16 citations · DOI
Cohen's κ coefficient has been widely used for assessing classification results derived from remote sensing data. It however presents several limitations, which prevent both its efficient use and a generalisation of its use. This paper reviews these problems and proposes, as an alternative, to prefer Krippendorff's α coefficient over Cohen's κ. Krippendorff's α indeed presents fewer flaws while dealing with more data types (hence allowing the rating of quantitative data), managing cases where more than two judgements are issued, and dealing with cases where no judgement is issued. These concepts are illustrated with some exemplary data sets.
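For readers unfamiliar with the coefficient, here is a minimal sketch of Krippendorff's α for nominal data (an illustration, not the paper's implementation; it handles any number of raters per unit but no missing-value weighting):

```python
from collections import Counter
from itertools import permutations

def krippendorff_alpha_nominal(units):
    # alpha = 1 - D_o / D_e, with observed and expected disagreement taken
    # from a coincidence matrix; units with fewer than 2 ratings are dropped.
    o = Counter()                       # coincidence counts o[(c, k)]
    n_total = 0
    for ratings in units:
        m = len(ratings)
        if m < 2:
            continue
        n_total += m
        for c, k in permutations(ratings, 2):
            o[(c, k)] += 1.0 / (m - 1)
    n_c = Counter()                     # marginal value frequencies
    for (c, _), w in o.items():
        n_c[c] += w
    d_o = sum(w for (c, k), w in o.items() if c != k) / n_total
    d_e = sum(n_c[c] * n_c[k] for c, k in permutations(n_c, 2)) / (n_total * (n_total - 1))
    return 1.0 if d_e == 0 else 1.0 - d_o / d_e

alpha = krippendorff_alpha_nominal([("a", "a"), ("b", "b"), ("a", "a")])
print(alpha)  # 1.0 for perfect agreement
```

Systematic disagreement drives α below 0, and α stays defined when more than two raters judge a unit, which is one of the advantages over Cohen's κ noted above.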
elib (German Aerospace Center) · 16 citations
The process of calibration is a prerequisite for each computer vision system. Calibration involves calculating both intrinsic and extrinsic parameters of the camera. While intrinsic parameters (focal length, principal point, etc.) are usually fixed, extrinsic ones (position and angles of the camera) have to be determined when the camera moves in relation to world coordinates. The calibration of the extrinsic parameters is usually performed with the help of reference objects or known measured points (GCPs) in the scene. In the case of a vehicle camera, where the coordinates refer to vehicle coordinates, it is not the extrinsic calibration but the alignment of the camera to the inertial measurement unit (IMU) that is necessary. This paper proposes a solution for determining the orientation of a vehicle camera in relation to the vehicle. This novel approach avoids tedious laboratory setups and reference measurements. It benefits from a known property of the road infrastructure, namely the parallelism of the road markers. For this purpose, lane markers are detected and transformed through a fast perspective removal (FPR) to an orthographic perspective. Newton's method is used to search for an optimal parameter set for this transformation. The algorithm works under the assumptions that the calibration is performed while driving on a straight and flat segment and that the lane markers are visible. It reaches very good performance (via parametric instead of image transformations) and good accuracy for the lateral detection of features in automotive applications (for depth information, the algorithm must be improved).
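The parallelism cue exploited above can be made concrete with a toy pinhole model; this is a hypothetical sketch (not the paper's FPR code, and all numbers are assumed): under a camera pitched down by an angle theta, the images of parallel lane markers converge at a vanishing point whose image row encodes theta.

```python
import numpy as np

f = 1000.0               # assumed focal length in pixels
theta = np.deg2rad(5.0)  # assumed (unknown-to-the-algorithm) camera pitch
R = np.array([[1.0, 0.0, 0.0],
              [0.0, np.cos(theta), -np.sin(theta)],
              [0.0, np.sin(theta),  np.cos(theta)]])  # pitch about the x-axis

def project(p_world):
    p = R @ p_world              # world -> camera rotation
    return f * p[:2] / p[2]      # pinhole projection to pixels

def image_line(x_off):
    # A lane marker: a straight line on the ground plane, 1.5 m below the camera
    a = project(np.array([x_off, -1.5, 5.0]))
    b = project(np.array([x_off, -1.5, 50.0]))
    slope = (b[1] - a[1]) / (b[0] - a[0])
    return slope, a[1] - slope * a[0]    # slope/intercept in the image

(s1, c1), (s2, c2) = image_line(-1.8), image_line(1.8)
x_v = (c2 - c1) / (s1 - s2)              # intersection of the two image lines
y_v = s1 * x_v + c1                      # = the vanishing point row
pitch_deg = np.rad2deg(np.arctan(-y_v / f))
print(round(pitch_deg, 2))               # recovers the 5.0 deg pitch
```

Making the markers parallel again in a top-down view, as the FPR approach does, is equivalent to finding the rotation that moves this vanishing point to infinity.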
Icarus · 15 citations · DOI
Lecture Notes in Computer Science · 14 citations · DOI
elib (German Aerospace Center) · 14 citations
OIS is a new optical information system for road traffic observation and management. The complete system architecture, from the sensor for automatic traffic detection up to the traffic-light management for a wide area, is designed under the requirements of an intelligent transportation system. Particular features of this system are the vision sensors with integrated computational and real-time capabilities, real-time algorithms for image processing, and a new approach for dynamic traffic-light management for a single intersection as well as for a wide area. The developed real-time algorithms for image processing extract traffic data even at night and under bad weather conditions. This approach opens up the opportunity to identify and specify each traffic object, its location, its speed and other important object information. Furthermore, the algorithms are able to identify accidents and non-motorised traffic such as pedestrians and bicyclists. Combining all these individual pieces of information, the system creates new derived and consolidated information. This leads to a new and more complete view of the traffic situation at an intersection. Only this makes dynamic and near real-time traffic-light management possible. To optimise wide-area traffic management, it is necessary to improve the modelling and forecasting of traffic flow. For this, information on the current origin-destination (OD) flow is essential. Taking this into account, OIS also includes an approach for anonymous vehicle recognition. This approach is based on single-object characteristics, the order of objects and forecast information, which is obtained from intersection to intersection.
Lecture Notes in Computer Science · 13 citations · DOI
elib (German Aerospace Center) · 13 citations
Using pushbroom sensors onboard aircraft or satellites one needs, especially for photogrammetric applications, broad image swaths and high geometric resolution together. To accomplish both demands, staggered line arrays can be used. A staggered line array consists of two identical CCD lines, one shifted half a pixel with respect to the other. Today such sensors are available in the VIS/NIR spectral region with 2 x 12000 pixels (staggered CCD line arrays) and in the mid and thermal infrared with 2 x 512 pixels (staggered MCT line arrays). These sensors will be applied in spaceborne remote sensing missions (the small satellite BIRD and the International Space Station project FOCUS, both for hot-spot detection and evaluation), in experimental airborne systems (with the infrared airborne camera HSRS of the Institute of Space Sensor Technology and Planetary Exploration) and, most importantly, in the first commercial Airborne Digital Camera (ADC, new name: ADS40) of LH Systems. The paper presents the theory, some results of the simulation, and examples with real imaging systems in the laboratory and on aircraft.
elib (German Aerospace Center) · 12 citations
Non-intrusive video detection for traffic-flow observation and surveillance is the primary alternative to conventional inductive loop detectors. Video Image Detection Systems (VIDS) can derive traffic parameters by means of image processing and pattern recognition methods. Existing VIDS emulate the inductive loops. We propose a trajectory-based recognition algorithm to expand the common approach and to obtain new types of information (e.g. queue length or erratic movements). Different views of the same area by more than one camera sensor are necessary because of the typical limitations of single-camera systems, resulting from the occlusion effect of other cars, trees and traffic signs. A distributed cooperative multi-camera system enables a significant enlargement of the observation area. The trajectories are derived from multi-target tracking. The fusion of object data from different cameras is done using a tracking method. This approach opens up opportunities to identify and specify traffic objects, their location, speed and other characteristic object information. The system provides new derived and consolidated information about traffic participants. Thus, this approach is also beneficial for a description of individual traffic participants. Keywords: multi-camera system, fixed-viewpoint camera, cooperative distributed vision, multi-camera orientation, multi-target tracking
Cooperations (1)
Confirmed researcher↔partner pairs from HU-FIS (gold-standard positives for the matching).
Design & implementation of a neural network for person detection (Transferbonus)
company
Master data
Identity, organisation and contact details from HU-FIS.
- Name
- Prof. Dr. Ralf Reulke
- Title
- Prof. Dr.
- Faculty
- Faculty of Mathematics and Natural Sciences
- Institute
- Institut für Informatik
- Phone
- +49 30 67055518
- HU-FIS profile
- Source ↗
- Last scraped
- 26.4.2026, 01:10:56