Academic affiliate
The work will entail:
The NIST Information Technology Laboratory (ITL) and Engineering Laboratory (EL) are collaborating on a project on real-time image processing for Additive Manufacturing. To meet real-time constraints, the project will need to enable computations on Field Programmable Gate Array (FPGA) devices, likely involving both traditional computer vision algorithms and deep learning models.
We plan to build a hard real-time system that can meet the time-sensitive deadlines for detecting sparks in footage from a high-speed camera monitoring the interaction between the melt pool and the laser. There are three methodologies to consider:
1. The camera contains a built-in FPGA that can process images as they are captured.
2. The capture card contains a slightly higher-end FPGA.
3. The capture card can transfer image data into system memory, allowing the host system to process images using either the CPU, GPU, or a combination of both.
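As an illustration of the third approach, spark detection on the host might begin with simple brightness thresholding followed by connected-component filtering. The sketch below is a minimal example only, not the project’s actual algorithm; the function name `detect_sparks`, the brightness threshold, and the minimum blob area are illustrative assumptions.

```python
import cv2
import numpy as np

# Illustrative parameters -- real values would be calibrated on the actual camera.
BRIGHTNESS_THRESHOLD = 240  # sparks appear as near-saturated pixels
MIN_SPARK_AREA_PX = 4       # reject single-pixel sensor noise

def detect_sparks(frame: np.ndarray) -> list[tuple[float, float]]:
    """Return centroids of spark-like blobs in an 8-bit grayscale frame."""
    # Keep only pixels bright enough to be spark candidates.
    _, mask = cv2.threshold(frame, BRIGHTNESS_THRESHOLD, 255, cv2.THRESH_BINARY)
    # Label connected bright regions and gather per-region statistics.
    n, _, stats, centroids = cv2.connectedComponentsWithStats(mask, connectivity=8)
    # Label 0 is the background; keep regions at or above the minimum area.
    return [tuple(centroids[i]) for i in range(1, n)
            if stats[i, cv2.CC_STAT_AREA] >= MIN_SPARK_AREA_PX]
```

Thresholding followed by connected-component analysis is attractive here because both steps are data-parallel, so the same structure maps naturally onto a GPU kernel or an FPGA pipeline.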
To this end, we are seeking a senior Computer Scientist who will supervise development of algorithms to process frames in real time from a high-frame-rate camera. The processing algorithms may utilize the camera’s built-in FPGA, the capture card’s FPGA, or conventional CPUs and GPUs.
Note that the exact number of hours per week will be negotiated once the successful candidate has been identified.
Computer Vision AI models for Additive Manufacturing image processing

Qualifications:
- A PhD in Computer Science, Engineering, Manufacturing, or a related field.
- At least 10 years of relevant experience supervising graduate students, or equivalent.
- Familiarity with image analysis algorithms.
- Familiarity with FPGA programming.
- Familiarity with CPU and/or GPU image analysis.
Key responsibilities will include but are not limited to:
- Supervise development of image analysis algorithms that target the high-speed camera’s FPGA.
- Supervise development of image analysis algorithms that target the capture card’s FPGA.
- Supervise development of image analysis algorithms that target the host system’s CPU(s) and GPU(s).
- Supervise measurement of real-time throughput for developed image analysis workflows (a minimal timing sketch follows this list).
- Supervise creation of AI/deep learning workflows for training models that analyze image sequences.
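As a rough guide to the throughput-measurement responsibility above, the sketch below times a frame-processing function on synthetic frames. It is a minimal host-side example under stated assumptions (the frame size, frame count, and `measure_throughput` helper are all illustrative); a real evaluation would measure end-to-end latency from camera capture to result, which this compute-only timing does not include.

```python
import time
import numpy as np

def measure_throughput(process, n_frames: int = 1000,
                       shape: tuple[int, int] = (512, 512)) -> float:
    """Return frames per second achieved by `process` on synthetic frames."""
    # Pre-generate frames so data generation is excluded from the timing loop.
    frames = [np.random.randint(0, 256, size=shape, dtype=np.uint8)
              for _ in range(n_frames)]
    start = time.perf_counter()
    for frame in frames:
        process(frame)
    elapsed = time.perf_counter() - start
    return n_frames / elapsed

# Example: check whether a detector keeps up with, say, a 1 kHz camera.
# print(f"{measure_throughput(detect_sparks):.0f} frames/s")
```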