I keep looking for ways to do things right. To call a method of my Activity from another class, I used to pass the Activity to that class and call the method directly. This did not feel right, and I wanted to make my class more generic, so I started looking for better approaches. I found one where I can pass any method as a reflect.Method and invoke it from that class. This seemed perfect, but as it turns out, it has some issues of its own.
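To make the idea concrete, here is a minimal plain-Java sketch of that reflection-based approach. The class and method names (`Callback`, `greet`) are illustrative placeholders, not the app's actual code:

```java
import java.lang.reflect.Method;

// A generic helper that stores a target object and a Method,
// and invokes it later without knowing the target's concrete type.
class Callback {
    private final Object target;
    private final Method method;

    Callback(Object target, Method method) {
        this.target = target;
        this.method = method;
    }

    Object invoke(Object... args) throws Exception {
        return method.invoke(target, args);
    }
}

public class ReflectionDemo {
    // Stand-in for an Activity method we want to call generically.
    public String greet(String name) {
        return "Hello, " + name;
    }

    public static void main(String[] args) throws Exception {
        ReflectionDemo demo = new ReflectionDemo();
        Method m = ReflectionDemo.class.getMethod("greet", String.class);
        Callback cb = new Callback(demo, m);
        System.out.println(cb.invoke("LenX")); // prints "Hello, LenX"
    }
}
```

The class holding the `Callback` never needs to know the Activity's type, which is what makes it generic; the trade-off (and one of the issues alluded to above) is that method lookup by name and signature is only checked at runtime.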
We launched LenX last week and got a lot of feedback. One suggestion was to include a timer utility for setting the exposure time. I read up a bit and found that a good way to do this was to use a Handler and create a separate thread.
So, finally, after two months of grappling with Android development and studying image-processing optimization on phones and photography concepts, my friends and I have brought our idea to fruition.
I have been working on LenX for a few weeks now, and since it is a camera app, it was necessary to add some type of focus. We decided to add touch-based focus: wherever the user touches the screen, the Android camera tries to focus on that area.
I have been working on an app which produces a long-exposure effect. As of now, we are almost ready with the free version of the app, and we have received feedback from some of the users. One of them said that we should make the action bar transparent, and that since it's a camera app, it'd look great if its interface looked similar to that of the Google Camera app. We are nowhere near that app, but we thought of giving it a try. Plus, making the action bar transparent gives more space for the camera preview.
Recently, I have started working with Android extensively. It hasn't been easy: I had to figure out a lot of stuff, and all of it simultaneously — how the UI works, how threads run, how to process frames, and so on. In my previous post, I showed how to process each frame in the onPreviewFrame callback. In this post, I am going to write about using asynchronous threads to do image processing, so that the workload on the UI thread is reduced.
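The core pattern is to hand each frame to a background worker so the camera callback only enqueues work and returns. A rough plain-Java sketch of that idea — the class name, frame format, and "processing" step are placeholders, not the app's actual code:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class FrameWorker {
    // Single background worker, analogous to a HandlerThread/AsyncTask:
    // frames are processed off the UI thread, one at a time, in order.
    private final ExecutorService executor = Executors.newSingleThreadExecutor();

    // Called from the camera callback; returns immediately.
    public Future<Long> submitFrame(final byte[] frame) {
        return executor.submit(() -> {
            // Placeholder "processing": sum the pixel byte values.
            long sum = 0;
            for (byte b : frame) {
                sum += (b & 0xFF);
            }
            return sum;
        });
    }

    public void shutdown() {
        executor.shutdown();
    }
}
```

On Android the result would typically be posted back to the UI thread with a Handler; here a Future stands in for that hand-off.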
I am currently working on an Android application which takes a video file and processes each frame to create a long-exposure effect. If you have worked with OpenCV on Android, you'd be aware of the fact that OpenCV does not support FFmpeg on Android yet. To get around this, one needs to manually build FFmpeg for Android.
I have been working on this Android application for long-exposure shots and want to put it on the Google Play Store. Since I have included OpenCV, the application needs OpenCV Manager to run. Having to download another app just to run mine can be demotivating for users. So I looked around, and there's a way to do static linking of the OpenCV modules. It's pretty easy too.
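For reference, this is roughly what the change looks like in jni/Android.mk with the OpenCV4Android SDK — a sketch, not a drop-in file; the path to OpenCV.mk and the module names depend on your project layout:

```makefile
LOCAL_PATH := $(call my-dir)
include $(CLEAR_VARS)

# Set BEFORE including OpenCV.mk: link the OpenCV modules statically
# instead of loading them through OpenCV Manager at runtime.
OPENCV_LIB_TYPE := STATIC
OPENCV_INSTALL_MODULES := on
include ../../sdk/native/jni/OpenCV.mk   # adjust to your OpenCV SDK path

LOCAL_MODULE    := my_native_module      # hypothetical module name
LOCAL_SRC_FILES := native_processing.cpp # hypothetical source file
include $(BUILD_SHARED_LIBRARY)
```

On the Java side, initialization then goes through OpenCVLoader.initDebug() instead of the asynchronous OpenCV Manager path.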
I have been working on an Android application for some time now. It's basically about getting long-exposure shots. With not much experience with Android, I just started working with OpenCV's CameraBridgeViewBase. Everything was working fine; the only issue was that it was horribly slow. I hardly ever got more than 9 fps. I once tried OpenCV static linking in order to avoid making users download OpenCV Manager, and the results got pathetically slower — I never even got 3 fps. I use a Nexus 4. I posted this issue on Stack Overflow as well as OpenCV's forum and haven't received any reply yet.
I was working on a small algorithm, and it took a while to do the complete processing, so I thought of using POSIX threads for multithreading — where I failed horribly. I spent a good amount of time on it, but realized it maybe needed a bit more. I knew that OpenCV has TBB support, so I started to look for small examples which would help me learn how to use OpenCV's TBB API. Examples were really hard to find, but thankfully, I found this post on the OpenCV forum: How To Use parallel_for.
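OpenCV's parallel_for is a C++ API, but the idea underneath it is simple: split an index range (say, image rows) into contiguous chunks and run each chunk's loop body on its own thread. As a rough sketch of just that range-splitting idea in plain Java (not OpenCV's actual API — the element-wise squaring stands in for a real loop body):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class ParallelRows {
    // Split [0, input.length) into nThreads contiguous chunks and run
    // each chunk on its own thread -- the same partitioning that
    // parallel_for applies to image rows.
    public static int[] square(final int[] input, int nThreads) throws InterruptedException {
        final int[] out = new int[input.length];
        ExecutorService pool = Executors.newFixedThreadPool(nThreads);
        int chunk = (input.length + nThreads - 1) / nThreads;
        for (int t = 0; t < nThreads; t++) {
            final int start = t * chunk;
            final int end = Math.min(start + chunk, input.length);
            pool.execute(() -> {
                for (int i = start; i < end; i++) {
                    out[i] = input[i] * input[i]; // per-element "loop body"
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.MINUTES);
        return out;
    }
}
```

Each thread writes to a disjoint slice of the output, so no locking is needed — the same property that makes parallel_for bodies safe when they only touch their own row range.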