Project Data Logger (Camera & IMU)

Hello,

I need some advice about the feasibility of my project. I am currently conducting research on vision for advanced driver assistance systems.

I would like to run some experiments with my iPhone X. I am looking for a way to simultaneously record, at about 30 Hz, the phone's attitude (roll, pitch, and yaw angles) and the image of the scene.

However, I am not familiar with object-oriented programming languages like Swift; I have only spent a few days learning the basics. Before going further, I would like to evaluate the feasibility of my project.

I would really appreciate it if someone could help me get started. Note that I reviewed the commercial apps and a lot of the code I could find, but none of them did the job.

Thanks,
Pm

It’s totally feasible. A lot of AR applications already do what you’re trying to do.
The difficulty could lie in how much you want to configure the capture and optimize it to run smoothly at 30 Hz (which shouldn't be too hard if all you need is to save the data for later).

I'm not sure how best to get someone started, but you could look at Core Motion for how to get a stream of gyroscope data.
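
For instance, a minimal sketch of streaming device attitude at roughly 30 Hz might look like the following (untested; the 1/30 s interval and the function name are just illustrative):

```swift
import CoreMotion

let motionManager = CMMotionManager()

// Illustrative helper: streams attitude (roll/pitch/yaw) at ~30 Hz.
func startAttitudeUpdates() {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.deviceMotionUpdateInterval = 1.0 / 30.0  // ~30 Hz
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let attitude = motion?.attitude else { return }
        // Angles are in radians; log or buffer them for later processing.
        print(attitude.roll, attitude.pitch, attitude.yaw)
    }
}
```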

Hi Lantua,

Thanks a lot for your advice.
As you suggested, I would initially like to process the data (IMU and images) offline. Consequently, I just need to capture the scene image and the corresponding IMU data simultaneously.

I spent time following some tutorials on Core Motion and even found great code on GitHub, such as MotionCollector.
I also took a look at the AVFoundation framework for camera recording.

Given my basic skills, though, I am having difficulty merging the two pieces of code.
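
From what I understand, the merge would pair each frame from an AVCaptureVideoDataOutput with the most recent Core Motion sample, something like the sketch below (untested; the class name FrameMotionRecorder and the logging are placeholders, and the app would still need a camera-usage entry in Info.plist). Is this roughly the right approach?

```swift
import AVFoundation
import CoreMedia
import CoreMotion

// Illustrative: pairs each captured camera frame with the latest attitude sample.
final class FrameMotionRecorder: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()
    private let output = AVCaptureVideoDataOutput()
    private let motionManager = CMMotionManager()
    private let queue = DispatchQueue(label: "frame.motion.recorder")

    func start() {
        // Back wide-angle camera as input.
        guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video, position: .back),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else { return }
        session.addInput(input)

        output.setSampleBufferDelegate(self, queue: queue)
        if session.canAddOutput(output) { session.addOutput(output) }

        // Sample motion faster than the frame rate so a fresh attitude
        // reading is always available when a frame arrives.
        motionManager.deviceMotionUpdateInterval = 1.0 / 100.0
        motionManager.startDeviceMotionUpdates()

        session.startRunning()
    }

    // Called once per captured frame (~30 fps by default).
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        let timestamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer).seconds
        guard let attitude = motionManager.deviceMotion?.attitude else { return }
        // Placeholder: a real logger would write the frame and a CSV row of
        // (timestamp, roll, pitch, yaw) to disk for offline processing.
        print(timestamp, attitude.roll, attitude.pitch, attitude.yaw)
    }
}
```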