
Monday 25 December 2017

jjmpeg 3.0 released

Put enough together to push out a release of jjmpeg.

It ended up 1700 lines of Java, 2000 lines of C, and 300 lines of Perl.

Apart from supporting the latest version of FFmpeg (at least as of when I started a couple of weeks ago), it's smaller, cleaner, and more complete than any previous version. Having said that, this is essentially just a beta release.

This one is licensed under the GNU General Public License Version 3 (or later).

I've kinda had enough for the moment so it's a pretty bare home page, but it's there.

Merry XMAS!

Friday 22 December 2017

damned enums

Been a long week but i'm finally done with work for another year. Although it's mostly a long week because of the late nights working on jjmpeg ...

One of the things I did was fill out/sync up the important enums - AVCodecID, AVPixelFormat, AVSampleFormat, and so on. Previously the pixel format and sample formats were also Java enums - which can be convenient at times and provide some more (albeit much much overvalued) 'type safety'.

This was fairly easy because the PixelFormat was a simple densely ordered C enum so i could map between the two with a simple +-1. Unfortunately someone decided to add a big hole in the middle of it sometime between 0.10 and 3.4, ... so I gave up and just converted it to a class holding static final ints, and to make it consistent I did that with the other enumerations as well. It doesn't really make the classes any harder to use and improves the class size and memory footprint. I just added some methods to access libav*'s metadata so I can still map between string representations and so on.
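The result ends up shaped something like this (a hand-written sketch only; the real class is generated so the values match the ffmpeg build, and the metadata method names are illustrative):

    // Sketch: the generated class holds the build's actual values.
    public class AVPixelFormat {
        public static final int AV_PIX_FMT_NONE = -1;
        public static final int AV_PIX_FMT_YUV420P = 0;
        public static final int AV_PIX_FMT_RGB24 = 2;

        // metadata helpers via libavutil, so names still round-trip
        public static native String getName(int pixfmt);
        public static native int get(String name);
    }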

I had to add a small compilation stage which extracts the enums from the header files and converts them to a C file which when compiled and run produces the Java source ... this seemed the absolute shortest path to ensuring I got accurate numbers based on the ffmpeg build configuration.

So after about a week's worth of solid work it's grown somewhat (about 2KLOC Java and 2KLOC C, counting lines with ";{}") and the TODO list is getting pretty short.

I would like to clean up the exception design a bit - unfortunately i'm just not very good at that (who is?) but i'd like to get better. The build system is clean and simple but could be improved and needs to include the aforementioned enum stuff, a dist target and versioning. Logging would be nice (both redirect ffmpeg to java.util.logging and some for jjmpeg itself). JJMediaWriter? Fix the license headers, add at least a README.

Not today though, today I drink.

Tuesday 19 December 2017

jjmpeg, jni, javafx

So I guess the mood took me; I somehow ended up poking away until the very late morning hours (4am) the last couple of nights hacking on jjmpeg. Just one more small problem to solve ... that never ended. Today I should've been working but i've given up and will write it off - it's nearly xmas break anyway so there's no rush, and i'm ahead of the curve.

JJMediaReader

I got this ported over and playing video fairly easily, and then went through on a cleanup spree. I removed all the BufferedImage, multi-buffering, and scaling stuff and a few other experiments which never worked. Some api changes allowed me to consolidate more code into a base class, and some changes to AVStream necessitated a different approach to initialising the AVCodecContext (using AVCodecParameters). I made a few other little tweaks on the way.
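The new setup goes roughly like this (a hedged sketch; the wrapper method names are assumptions mirroring the underlying ffmpeg 3.x calls, not confirmed jjmpeg ones):

    // Initialise a decoding context from AVCodecParameters rather than
    // the deprecated AVStream.codec field.
    AVStream stream = format.getStream(index);
    AVCodecParameters cp = stream.getCodecPar();
    AVCodec codec = AVCodec.findDecoder(cp.getCodecId());
    AVCodecContext cc = AVCodecContext.allocContext(codec);
    cc.parametersToContext(cp);
    cc.open(codec, null);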

The reason I removed the BufferedImage code is that I didn't want to pollute it with "platform specific" code, i.e. swing, javafx, etc. I've moved that functionality into a separate namespace (module?).

My first cut just took the BufferedImage code and put it into another class which provides the functionality by taking the current AVFrame from the JJMediaReader video stream. This'll probably do but when working on similar functionality for JavaFX I took a completely different approach - implementing a native PixelReader() so that the native code can decide the best way to write to the buffer. This is perhaps a little more work but is a lot cleaner to use.
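The idea is that the toolkit pulls the pixels itself (hedged sketch; `VideoPixelReader' is an illustrative name, not necessarily the actual class):

    // A PixelReader implementation backed by native code can be handed
    // straight to a WritableImage - JavaFX asks it for pixels and the C
    // side picks the best conversion/copy path.
    WritableImage image = new WritableImage(width, height);
    PixelReader frame = new VideoPixelReader(reader.getVideoStream());
    image.getPixelWriter().setPixels(0, 0, width, height, frame, 0, 0);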

swscale

jjmpeg1 lets you scale images 'directly' to/from primitive arrays or direct ByteBuffers in addition to AVFrame. Since they have no structure description (size, format), this either has to be passed in to the functions (messy) or stored in the object (also messy). jjmpeg1 used the latter option and for now I simply haven't implemented them.

The PixelReader mentioned above does implement it internally but for code re-use it might make sense to implement them with the structure information as explicit parameters, and use higher level objects such as PixelReader/Writer to track such information. On the other hand the native code has access to more information so it also makes sense to leave it there.
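For example, the explicit-parameter form could look something like this (an assumption about a possible api, not the current one):

    // Structure description passed explicitly rather than stored on the
    // buffer object; a PixelReader/Writer would carry this info instead.
    public static native void scale(
        ByteBuffer src, int srcFormat, int srcWidth, int srcHeight,
        ByteBuffer dst, int dstFormat, int dstWidth, int dstHeight);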

I went a bit further and created a re-usable super-class that does most of the work and toolkit specific routines only have to tweak the invocation. This approach hides libswscale behind another api. The slice conversions don't work properly but they're not necessary.

jni

So far I had public constructors and `finalisers' because otherwise the reflection code failed. That's a bit too ugly (and `dangerous') so I made them private. The reflection code just has to look up the constructors and set them accessible:

    // Look up the private native-pointer constructor and make it callable.
    Constructor<?> cc = jtype.getDeclaredConstructor(Long.TYPE);
    cc.setAccessible(true);
    return cc.newInstance(p);

Whilst working on JJMediaReader I hit a snag with the issue of ownership. In most cases objects are either created anew and released (or gc'd) by the Java code, or are simply references to data managed elsewhere. I was addressing the latter problem by simply having an empty release() method for the instance, but that isn't flexible enough because some objects may be either created or referenced, and the context determines which.

So I expanded the Java-side object tracking to include a `refer' method in addition to the `resolve' method. `resolve' either creates a new instance or returns an existing one, with a weak-reference object which will invoke the static release method when it gets finalised. `refer' on the other hand does the same thing but uses a different weak-reference object which does nothing.

I then noticed (the rather obvious fact) that if an object is created, it can't possibly 'go away' from the object tracking while it is still alive; therefore the `resolve' method was doing redundant work. So I created another `create' method which assumes the object is always a new one and simply adds it to the table. It can also do some checking but i'm pretty sure it can't fail ...

If on the other hand the underlying data was reference counted then the `resolve' method would be useful since it would be possible to lookup an existing object despite it being `released'. So i'll keep it in CObject.

As part of this change I also improved CObject in other ways.

I was storing the weak reference to the object itself inside the object so I could implement explicit release and to avoid copying the pointer. I removed that reference and only store the pointer now. The WeakReference is already tracked in a hash table so I just look it up if I need it. This lets me change the jni code to use a field lookup rather than a function call to retrieve it (I doubt it makes much perf difference but I will profile it at some point).

I also had some pretty messy "cross-layer" use of static variables and messy synchronisation code. I moved all map references to outside of the weak reference routine and use a synchronised map for the pointer to object table.

For explicit release I simply call .clear() and .enqueue() on the WeakReference - which seems to do the right thing, and simplifies the release code (at least conceptually) since it always runs on the same thread.
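In rough outline the tracking now looks like this (a simplified sketch of the mechanism described above, not the code verbatim; `CReference', `NullReference' and `Factory' are illustrative names):

    // CReference frees the native pointer when its referent is collected;
    // `refer' would record a NullReference which frees nothing instead.
    static final Map<Long, CReference> map =
        Collections.synchronizedMap(new HashMap<Long, CReference>());

    static CObject create(long p, CObject o) {
        map.put(p, new CReference(o, p));   // known new, just record it
        return o;
    }

    static CObject resolve(long p, Factory f) {
        CReference ref = map.get(p);
        CObject o = ref != null ? ref.get() : null;
        return o != null ? o : create(p, f.make(p));
    }

    static void release(CObject o) {
        CReference ref = map.get(o.getP());
        ref.clear();     // drop the referent,
        ref.enqueue();   // and run the normal cleanup path immediately
    }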

Sunday 17 December 2017

jjmpeg & stuff

Well for whatever reason I got stuck into redoing jjmpeg and seem to have written most of the code (90%?) after a couple of weekends. It was mostly mandraulic and a bit tedious but somehow surprisingly relaxing and engaging; a short stint of unchallenging work can be a nice change. A couple of features are still missing but the main core is done.

Unfortunately my hope that the ffmpeg api was more bindable didn't really pan out but it isn't really any worse either. Some of the nastiest stuff doesn't really need to be dealt with fortunately.

I transformed most of the getters and setters into a small number of simple macros, and thus that part is only about as much work as the previous implementation despite not needing a separate compilation stage. I split most of the objects into separate files to make them simpler to maintain and added some table-based initialisation helpers to reduce the source lines and code footprint.

It's pretty small - counting `;' there's only 750 lines of C and 471 lines of Java source. The 0.x version had 800 lines of C and 900 lines of Java, a big portion of which was generated from an 800 line (rather unmaintainable) Perl script. And the biggest reduction is the compiled size: the jar shrank from 274KB to 73KB, with only a modest increase from 55KB to 71KB in the (stripped) shared library size (although the latter doesn't include the dvb or utility classes).

There's still a lot of work to do though, I still need to test that anything actually works and port over the i/o classes and enum tables at the least, and a few more things probably. This is the boring stuff so it'll depend on my mood.

Fuck PCs

In other news I finally killed my PC - I tried one more time to play with the BIOS and after a few updates it got so unstable it just crashed during an update and bricked the motherboard. Blah. I discovered I could order a new BIOS rom so i've done that and i'll see if i can recover it, otherwise I might get another mobo if I can still get AM2+ boards here, or just get another machine. I'll probably look into the latter anyway as it's always been a bit of a hassle (despite working flawlessly when it does, and it's a very nice small machine).

Friday 8 December 2017

jjmpeg?

Well i've had reason to visit jjmpeg again for something and although it's still doing the job, it's a very very long way behind in version support (0.10.x?). I've added a couple of things here and there (recently AVFormatContext.open_input so I could open compressed webcam streams) but i'm not particularly interested in dropping another release.

But ... along the way I started looking into writing a new version that will be up to date with current ffmpeg. It's a pretty slow burner and i'm going to be pretty busy with something (relatively interesting, moderately related) for the next couple of months.

But regardless, here are a few what-ifs should I continue with the effort.

  • The old generator + garbage collection support required 4 classes per object.

    1. Abstract autogenerated native WeakReference based accessors with native-oriented methods (passing `pointers').
    2. Manually written native accessors as above and the glue to make it all work.
    3. Abstract autogenerated public accessors with public-oriented methods (passing objects).
    4. Manually written public accessors as above and the glue to make it all work.

    Whilst most of it is autogenerated the generator sucks to maintain and it's a bit of a mess. I've also since learnt that cutting down the number of classes is desirable.

    So instead i'll use the "CObject" mechanism with the WeakReference being a simple native pointer holder object which also knows how to free it. In this case at most 2 custom classes are required - one for autogenerated code (if that happens) and any helper/custom code.

    A few things require reflection going this route but the overheads should be acceptable.

  • Native memory was wrapped in native ByteBuffer objects.

    Originally the goal was to have java just access fields directly but in practice this wasn't practical as the structures change depending on the compile options, so you end up with both the C and Java code being system specific, and the Java code requires a compiler to implement it (C being handled by gcc). A side-goal was to make the Java library bit-size independent without resorting to long - although that's all ByteBuffer uses.

    Because the objects are just wrapped on the pointer there is the possibility that multiple objects can be created to reference the same underlying C object (e.g. getStreams().get(0) repeated). Whilst this isn't as bad as it sounds, one has to ensure the objects aren't holding any of their own state java-side, for example. It also turns out that a direct ByteBuffer isn't terribly fast either from the Java side or looking up from the C side (not sure why on the latter).

    CObject just uses a long directly, which also precludes the likelihood of poking around C internals by accident or otherwise. It also ensures unique objects reference unique pointers - this requires some overhead but it isn't onerous.

  • Using two concrete classes per object allowed the internal details of passing pointers (ByteBuffer) around to be hidden from the luser.

    • But it requires a lot of scaffolding! The same method written at least 2 times!
    • Although the C call gets a ByteBuffer directly, looking up the host pointer still requires a JNIEnv callback.

    CObject likewise uses an accessor to retrieve the native pointer, but because it's the super-class of all objects the objects can simply be passed in directly. That is, native methods just look like java methods and so there is no need for any trampolines between method interfaces. It does require a bit more support on the JNI side when returning objects, but it's trivial code.

    An alternative would be to use a long and pass that around but then you still need the public/native separation and all the hassle that entails.

  • A lot of the current binding is autogenerated.

    Once the generator was written this was fairly easy to maintain but getting the generator complete enough was a lot of work. The biggest issue is that the api just isn't very consistent, and some parts just don't map very nicely to a Java api. Things such as out parameters - or worse, absolute snot like AVDictionary that should never have existed (onya libav!).

    Each case required special-case code in the generator, often extra support code, and sometimes a fall-back to manually writing the whole lot.

    Working with zcl - and in that case the OpenCL apis are much cleaner and consistent - I discovered it was somewhat less work just to do it manually and not really any harder to maintain afterwards. At least in the case of the original ffmpeg the inconsistency was simply because it wasn't originally intended as a public library, and I suspect the newer versions might be a bit better.

    I'm still undecided about simple data accessors as a good case can be made for saving the typing if there are many to write. So perhaps they could still be autogenerated (to an abstract super class as now), or they could be parameterised like they are in zcl (i.e. internally getInt(field) with public wrappers). Another half and half option would be to use the C preprocessor to do most of the ugly work and still write the Java headers by hand. Probably the last one.
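    For reference, the zcl-style parameterised form is just (illustrative field name, not actual code):

        // Public wrapper over a single parameterised native accessor.
        public int getWidth() {
            return getInt(FIELD_WIDTH);
        }

        private native int getInt(int field);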

Does anyone else care?

Wednesday 6 May 2015

post google code post

Well nobody bothered to comment about the stuff i removed from google code apart from the one lad or lass who lamented the loss of some javafx demos.

I had comments open+moderated for a few weeks but got hit by spammers a couple of days ago so had to go back to id+moderated. Maybe something got lost in those 500 bits of snot but i don't think so. The spam was quite strange; most mentioned web sites but didn't provide links or weren't very readable so i'm not sure what the point was. Perhaps they're just fishing for open sites or naive moderators they can then exploit. Like the "windows computer department" that keeps calling and calling hoping i'll not tell them to fuck off every time (sigh, no i don't normally say that although i would tonight).

I've still got the subversion clones but i'm not inclined to do much with any of it for the foreseeable future and i'm not even sure if i'm going to continue publishing other bits of code i play with going forward.

Desktop Java, OpenCL, ARM assembly language; these things are just not very common in the Free Software world. Server Java is pretty common but that's just, well, `open sauce' companies sharing costs and not hobbyists. So i think all i'm really doing is providing hints or solutions for some student's homework or help for graduate programmers to keep their jobs. And even then it's so niche it wouldn't be many, if any.

As an example of niche, I was looking up some way to communicate with adobe photoshop that doesn't involve psd format and one thing i came across was someone linking to one of my projects for some unfinished experiments with openraster format - on the first page of results. This happens rarely but still too often. Of course it could just be the search engine trying to be smart and tuning results to the user, which is a somewhat terrifying possibility (implications beyond these types of searches of course). FWIW I came to the conclusion photoshop is just one of those proprietary relics from the past which intentionally refuses to support other formats so its idiot users can continue to be arse-reamed by its inflated price.

It's just a hobby

As a hobby i have no desire to work on larger projects of my own or other established projects in my spare time. Occasionally i'll send in a patch to a project but if they want a bunch of fucking around then yeah, ... naah. In hindsight i somewhat regret how we did it on evolution but i think i've mentioned that before. Neither do i need to solicit work or build a portfolio or just gain experience.

I'm not sure how many hobbyists are around; anyone with remotely close to enough skill seems to be jumping into the wild casinos of app-stores or services and expecting to make billion$ and not just doing it for the fun of it. Some of those left over just seem to be arrogant egotistical fuckwits (and some would probably think the same of me). Same as it ever was I guess.

I suppose I will continue to code-drop even if it's just out of habit.

For another hobby I made kumquat marmalade on the weekend. Spent a couple of hours in the sun slicing the tiny fruit and extracting seeds (2-3 cups worth of seeds) and cooked it the next day. Unfortunately after all that effort it looks like it wasn't cooked quite enough and it probably wont set - it's a bit runny but at least it tastes good. Not sure what i'll do with 2-odd litres of the stuff though.

Friday 13 March 2015

ahh shit, google code is being scrapped

Joy. I guess it's not making google enough money.

I'll make a copy of all my projects and then delete them. Probably sooner rather than later (probably immediately because i'd rather get it out of the way, i have a script going now). I'm keeping all the commit history for now although i never find it particularly useful. Links in old posts may break.

Some of them may or may not appear later somewhere else, or they may just sit on my hdd until i forget about them and delete them by accident or otherwise.

Wherever they end up though it wont be in one of the other commercial services because i'm not interested in doing this again.

I need to do the same to this blog, ... but that can wait.

Update: So ... it seems someone did notice. Don't worry i've got a full backup of everything. In part I was pissed off with google and in part I wanted to make it immediately obvious it was going away to see if anyone actually noticed. Who could know from the years of feedback i've never received.

If you are interested in particular projects then add a comment about them anywhere on this blog. All comments are moderated so they wont appear till i let them through but there is no need to post more than once (i'll delete any nonsense or spam unless it makes you look like a bit of a wanker). I will see about enabling anonymous comments for the moment if they are not already on. I can then decide what to do depending on the interest.

I will not be using github or sourceforge nor the like.

In a follow-up post i'll see where i'm going with them as well as pondering the future location and shape of the blog itself. I've already written some stuff to take the blogger atom stuff, strip out the posts, download images, and fix the urls.

(in resp to Peter below) I knew the writing was on the wall when google code removed binary hosting: it's kinda useless for its purpose without it. This is why all my new projects subsequent to that date are hosted differently.

Wednesday 12 February 2014

javafx + internet radio = sigh

I thought i'd look at porting the android internet radio player I have over to JavaFX; although jjmpeg is an option I thought i would first try with the JavaFX MediaPlayer. Thought it might be a simple distraction on a hot day.

Unfortunately it's a no go so far.

Firstly it just doesn't accept the protocol from the shoutcast server that i'm using: it sends a response with ICY 200 OK rather than HTTP/1.1 200 OK. Because mpeg is a streaming format, players normally just ignore that and keep going even if they aren't aware of the shoutcast protocol (which they normally are).

Then I realised it requires ffmpeg 0.10 libraries to function (the same version jjmpeg/head uses - which seems pretty odd for such a new product) - so I pointed to my local build of those and at least got it playing local mp3 files ...

So since I had that going I hacked up a quick proxy server (it's something i wanted to look at anyway wrt android, so the player can extract the current song info from the stream) and after mucking about with some silly bugs I managed to get it to the point of segfaulting. If I save the content of the stream it will play that ok; it just seems to have trouble with loading from a network. The proxy server just rewrites the ICY header response to say HTTP/1.1 instead.
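The rewrite itself is trivial (a sketch; the line i/o helpers are assumed, not real library calls):

    // Rewrite the shoutcast status line so the client sees plain HTTP;
    // everything else is passed through unchanged.
    String status = readLine(serverIn);
    if (status.startsWith("ICY"))
        status = "HTTP/1.1" + status.substring(3);
    writeLine(clientOut, status);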

Steaming Poo.

I'm using the latest jdk 1.8.0 release candidate as of 12/2/14. I suspect it's a version compatibility issue with my ffmpeg build or it could just be a bug in the media code - given it works with local files and particularly since it's using gstreamer: the second would be no surprise to me at all because gstreamer is a pile of shit.

   Stack: [0x84dad000,0x855ae000],  sp=0x855ad1e4,  free space=8192k
   Native frames: (J=compiled Java code, j=interpreted, Vv=VM code, C=native code)
   C  [libfxplugins.so+0x8d28]  cache_has_enough_data+0xc
   C  [libfxplugins.so+0x74b8]  progress_buffer_loop+0xbd
   C  [libgstreamer-lite.so+0x6bdd5]  gst_task_func+0x203
   C  [libgstreamer-lite.so+0x6cebd]  default_func+0x29
   C  [libglib-2.0.so.0+0x6c3a1]  fileno+0x6c3a1
   C  [libglib-2.0.so.0+0x69bd0]  fileno+0x69bd0
   C  [libpthread.so.0+0x5e99]  abort@@GLIBC_2.0+0x5e99

My proxy code is also totally shit so that's always a possible cause but it shouldn't cause a crash regardless.

Update: The above was on an x86 machine, also tried on an amd64 isa and it did the same. So probably version incompatibilities. Unfortunately those versions (at least) of ffmpeg can be compiled in binary-incompatible ways even if the library is the same revision, and if it's been built against ubuntu or debian, well they like to mess around with their packages.

Update 28/2/14: So I thought perhaps it was something to do with the debian mess of using libav instead of ffmpeg. But trying that results in the same crash. I also tried the libavcodec.so.53 that comes with the gst-ffmpeg-0.10.13 package ... but that doesn't include avcodec_open2(). Hmm, so much for version numbers eh.

Also tried ffmpeg 0.8.x and 0.9.x. But they all just crash in the same place.

On my new pc I also tried using the version of 'libavcodec.so.53' that comes with gst-ffmpeg-0.10.13 ... but that 'version 53' of ffmpeg doesn't include avcodec_open2(). Actually it looks like that is also using libav for fucks sake. Oh, so it seems that it's actually using ffmpeg 0.7.2 instead. How odd.

What were oracle thinking if it is built against libav - a buggy and insecure fork of another project and a version that is well beyond maintenance at that.

I guess I will keep trying different versions and see if one sticks. Pretty fucked up though - libraries have a version for a reason and so what's the point if they don't actually mean anything. Possibly the situation was compounded by the whole libav fork. Or it might just be a bug with gstreamer-lite.

Time passes ...

Ok, so 0.8.x and 0.9.x also provide 'libavcodec.so.53' ... and they all crash in the same place, so quite probably it's just the code in libfxplugins.

Shit, I even tried building openjfx ... but even though it has the source-code for libfxplugins.so ... it doesn't seem to build it and just copies it from the installed jre. It doesn't help that it uses gradle which is both slow and opaque.

See follow-up post.

Friday 24 May 2013

on google

So google have decided to disable downloads on google code.

So I have decided to stop using it.

... although as yet I have no concrete plans or timeline for when this decision will take effect.

Whilst they claim it's about abuse, one can only assume that is just a "likely-sounding excuse" for what in reality is just another straight-up lie from the PR department of a supra-national conglomerate, and it's really just a way to cut costs and promote their 'drive' service (a useless microsoft/apple only service as far as i'm concerned).

Nobody seems to have reported that they have also gimped their POP interface to gmail a couple of days ago. No more UID support. This makes POP a lot less reliable/useful as a mail store (although in honesty it was never designed for that purpose). I proceeded to delete all the mail in gmail to help them free up some disk space.

I guess over-all the writing is on the wall. We all know that at some point 'google account' will mean 'google+', and blogger may be retired at any time.

So it seems my on-going-but-totally-lax search for alternatives to 'everything google for convenience' just got another big kick up the rump-side.

As my projects are all pretty small and low-volume I might look at a local solution because every network based solution faces the same problem. I have a couple of beagleboards doing nothing although getting a running and secure-enough system might be more pain than it's worth.

It's a bit of a pain to have to deal with.

Thursday 4 April 2013

ActionBar, cramped screen-space, etc.

So after a few ui changes I tried jjjay on the tablet ... oh hang on, there's no menu button and no way to bring it up.

That's one ui design idea out the window ...

So basically I am forced to use an actionbar, and seeing how that is the case I decided to get rid of everything else and just use the actionbar for the buttons. Goodbye jog-wheel, it wasn't really doing anything useful on that screen anyway (unless I added a preview window, but that's going to be too slow/take up too much room on a phone). I also moved the scale/scrollbar thing to the bottom of the screen which is much more usable on a phone as otherwise you accidentally hit the action bar or the notification pull-down too easily.

It creates a more uncluttered view so I suppose it was a better idea anyway.

I fixed a few other little things, played with hooking up buttons to focus/action, and experimented with a 'fling' animator, but at this point some bits of the prototype hacked-up code are starting to collapse in upon themselves. It's not really worth spending much more time trying to get it to do other things without a good reorganisation and clean up of some fairly sizeable sections.

I need a model for the time scale pan/zoom although I could live without that for the time being. A model for the sequencer is probably more necessary otherwise adding basic operations such as delete get messier and messier. There's still a bunch of basic interface required too; setting project and rendering particulars, and basic clip details such as transition time and effects.

So this is basically where the prototype development ends and the application development begins. 90% of the effort left? Yes probably. 6 days of prototype to 2 months of application development ... hmm, sounds fairly reasonable if a little on the optimistic side, generally it takes 3 writes to get something right.

I guess i'll find out if I keep poking at it ...

Wednesday 3 April 2013

threads, memory, database, service.

I hooked up the video generator to the jjjay interface last night and did some playing around on a phone form factor.

The video reading and composing was pretty slow, so I separated them into other threads and have them prepare frames in advance. With those changes frames are generally ready in Bitmap form before they're needed, and 80% of the main thread is occupied with the output codec. This allowed me to implement a better frame selection scheme that would let me implement some simple frame interpolation for timebase correction. The frame consumer keeps track of two frames at once, each bracketing the current timestamp. Currently it then chooses the lower-but-in-range one. The frame producer just spits out frames into a blocking queue from another thread - before I had some nasty pull logic and nearby-frame cache, but that is the kind of dumb design decision one makes in design-as-you-go prototype code written at some funny hour of the night.
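The consumer side of that ends up being only a few lines (illustrative names; `queue' is the BlockingQueue the producer fills, and this assumes decoding starts at or before the target position):

    // Keep two frames bracketing 'position'; show the lower in-range one.
    while (next == null || next.pts <= position) {
        frame = next;
        next = queue.take();
    }
    display(frame);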

I guess to be practical at higher resolutions hardware encoding will be necessary, but at SD resolution it isn't too bad. VGA @ 25fps encodes around 1-2x realtime on a quad-core phone, depending mostly on the source material. I think that's liveable. 1280x720 x264 at 1Mb/s was about 1/4 realtime. I suppose I should investigate adding libx264 to the build too.

I cleaned up some of the frame copying and so on by copying the AVFrame directly to a Bitmap, it still needs to use some pretty slow software YUV conversion but that's an issue for another day. Memory use exploded once I started decoding frames in other threads which was puzzling because it should've been an improvement from the previous iteration where all codecs were opened for all clips in the whole scene. But maybe I just didn't look at the numbers. I guess when you do the sums 5x HDxRGBA frames adds up pretty quick, so I reduced the buffering.

It was fun to finally get some output from the full interface, and as mentioned in the last post helps expose the usability issues. I had to play a bit with the interface to make the phone fit better, but i'm still not particularly happy with the sequence editor - it's just hard fitting enough information on the screen at a reasonable size in a usable manner. I changed the database schema so that clips are global rather than per project, and create and store clip/video icons along with the data.

So with a bit more consolidation to add a couple of essential features it'll approach an alpha state. Things like the rendering need to be moved to a service as well. And work out the build ... ugh.

At least I have worked out a (slightly hacked up) way to use jjmpeg from another Android project without too much pain. It involves some copying of files but softlinks would probably work too (jjmpeg only builds on a GNU system so I don't think that's a big problem). Essentially jjmpeg-core.jar is copied to the libs directory and libjjmpegNNN.so is referenced as a pre-build library in the projects own jni/Android.mk. If I created an android library project for jjmpeg-core this could mostly be automatic (apart from the make inside jni).

Incidentally as part of this I've used the on-phone camera to take some test shots - really pretty disappointed with the quality. Phones still have a long way to go in that department. I can forgive a 100$ chinese tablet and its front-facing camera for worn-out-VHS quality, but a 800$ phone? Using adb logcat on this 'updated, western phone' is also very frustrating - it's full of debug spew from the system and bundled software which makes it hard to filter out the useful stuff - cyanogen on the tablet is far quieter.

Rounding to the nearest day jjjay is about 5 days work so far.

Attached Bitmaps

On an unrelated note, I was working on updating a bitmap from an algorithm in a background thread, and this caused lots of crashes. Given it's 3KLOC of C+Assembly one generally suspects the C ... But it turned out just to be the greyscale to rgba conversion code which I just hacked up in Java for prototype purposes.

Android seems to do OpenGL stuff when you try to load the pixels on an attached Bitmap, and when done from another thread in Java, things get screwed up.

However ... it only seems to care if you do it from Java.

By changing the code to update the Bitmap from the JNI code - which involves a locking/unlocking step - it all seems good.

Of course as a side-effect simply moving the greyscale to RGBA conversion to a C loop made it run about 10x faster too. Dalvik pretty much sucks for performance.

Update: Well add a couple more hours to the development. I just moved the rendering task to a Service, which was overall easier than I remembered dealing with Services last time. I guess it helps when you have your own code to look at and maybe after doing it enough times you learn what is unnecessary fluff. Took me too long to get the Intent-on-finished working (you can play the result video), I wasn't writing the file to a public location.

I cleaned up the interface a bit and moved "render" to a menu item - it's not something you want to press accidentally.

So yeah it does the rendering in a service, one job at a time via a thread pool, provides a notification thing with a progress bar, and once it's finished you can click on it to play the video. You know, all the mod-cons we've come to expect.

Monday 1 April 2013

A bit more on jjj

I spent a few spare hours yesterday and today poking at some of the other basic functions needed for an android video editor. First a list of icons that loads asynchronously and then some database stuff.

The asynchronous loading with caching was more involved than i'd hoped but I have it (mostly) working. Unfortunately Android calls getView(0) an awful lot, and often on a view which isn't actually used to show the content, so trying to match views to latent requests doesn't always work (for view 0 only).

I've done something like this before using a thread and request handling queue, but this time I tried just using a threadpool and futures. I tested it by loading individually offset frames from a mp4 file - i.e. it's pretty slow - and the caching works fairly well, i.e. it drops processing of frames you've scrolled away from so that the screen refreshes relatively efficiently. The GUI remains smooth and responsive. Apart from the item-0 problem it would be perfectly adequate. And in practice the images would just be loaded from a pre-recorded jpeg so most of the loading latency would vanish.

After that I came up with a fairly reusable class which lets me add 'async image loading and caching' to any list or gridview, which is a necessity on Android to save on memory requirements. I used it to create a graphical clip selector in only a few lines of code.
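The core of the loader is small (a rough sketch; loadIcon and the cache are illustrative, not the actual class):

    // One Future per request; cancel it if the item scrolls out of view
    // so the pool spends its time on what's actually on screen.
    Future<Bitmap> icon = pool.submit(new Callable<Bitmap>() {
        public Bitmap call() throws Exception {
            return loadIcon(item);
        }
    });
    cache.put(position, icon);  // later: icon.cancel(true) when scrolled away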

And today I filled out another big chunk - creating a database to store everything. I tried to keep it as simple as possible but it's still fairly involved. I was thinking of using Lucene just as an index, but then I decided I really did want the referential integrity guaranteed by using Berkeley DB JE. So I used that instead. It's got a really great API for working with POJO's and it's pretty simple to set up complex relational databases using annotations. The only real drawback of using it is that if you need to write complex joins and so on you have to code them by hand: but I don't need them here, and besides the complex joinery in SQL often means you have to write messy code to handle it anyway. It's not something I miss having to deal with.
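For those who haven't used it, a BDB JE entity is about this simple (the fields here are illustrative, not the actual schema):

    // POJO persisted by Berkeley DB JE; secondary keys give the
    // referential integrity between tables.
    @Entity
    public class Clip {
        @PrimaryKey(sequence = "clip-id")
        long id;

        @SecondaryKey(relate = Relationship.MANY_TO_ONE,
                      relatedEntity = VideoFile.class)
        long videoId;

        long start, end;
    }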

Then I further hacked up the hackish sequence editor to fit in with the DB backend. Actually I hacked up quite a bit of glue to tie together all the GUI work I have done so far:

Projects
The main window is a project window, from which you browse the projects, and create another one. Clicking on a project jumps straight into the scene editor. I'm using an AsyncTaskLoader to open the database which can take a second or so.

A project will have a collection of scenes and probably an output format. Although initially I am only implementing it using a single scene otherwise it seems to add too much complication to the interface.

Scene
This is currently the sequence editor as mentioned in the last post. Tracks/layers can be edited using drag and drop. Clicking on the "add clip" brings up the Clip manager, and after a clip is selected it is added to the scene. Changes are persisted to the database as they occur.

Clip manager
This brings up a list of clips which have been defined for the given project. Clicking on a clip just selects it and returns it to the caller (i.e. the Scene). There is also an "add" menu item which then brings up a file requester (using the standard Android Intent for "get content") which lets you choose a video file. Once one is selected, it then jumps to the Clip editor.

The database tracks video files used in a separate table, but i'm not sure adding a separate GUI for that is terribly useful - just clutter really. But the table will be used if I want to transcode videos into a quick-editing format.

Clip Editor
And this is the first interface I mentioned a couple of posts ago. It lets one mark a single region within a video file, and then save it as a clip. The clip is then saved back into the clipmanager.

It's surprising how much junk you have to write to get anything going. It's not like a command line programme where you can just add another switch in a couple of lines of code; you need a whole new activity, layout, intent management, response handling, blah blah etc etc.

So I guess at another two half-days that takes the total effort to about 4, although i should probably round it up to 5 being a bit more conservative. A few hours more to hook in the prototype renderer and I would have a basic video editor done - not bad for a weekend hack, even if it was a particularly super-long weekend ;-). There are of course some fairly important features missing such as transitions and animations and so forth.

Although having it at this level of integration also begins to show up all the usability issues and edge-cases i've ignored until now. It is also where one can investigate alternatives fairly cheaply: e.g. does it really work having clips per-project, or should they be a globally available resource library? This is what prototypes are for, after-all ...

It's also starting to approach the point where the hacking for entertainment starts to turn south; maintaining and fixing vs exploring and creating.

Saturday 30 March 2013

jjj sequence editor

Sometimes I hate it when I have ideas in my head and they just wont go. Last night I was getting bored with TV and kept thinking about a sequence editor - the most complex single interface component I'm after - so I fired up the computer and hacked from 11pm till 3am or so. The less sleep I get the more "anxious to get shit done" I seem to be, until I fall apart.

So although it's not much to look at yet - and the code behind it is even worse - last night together with a few hours today I came up with the following interface and all the guts to make it work.

I'm not sure if i need the jog-wheel there as the functionality is provided by the scale too, but it lets one move the timeline forwards/backwards. I also doubt i need to support any-number-of-layers, but I suppose I may as well - once you have more than one, any number isn't much harder.

The scale at the top can be dragged - it acts like a scroll-bar, or two fingers pressed on it can be used to zoom in and out.

The main sequence interface is modal, in that you are either working with 'layers' (or 'tracks'?) or clips. The "Layers" button is a toggle which switches modes, although I think I need a better interface design for that.

I don't have animations (i'm kind of up in the air on them as most of the time they just piss me off with their added delay) but the clips can be "drag and dropped" around the sequence in fairly obvious ways. So for example dragging a clip creates a layer-locked box which follows your finger, and a | indicator where it can be inserted. Letting go moves the clip to the new location, creating new layers as required.

Layers work similarly, except layers can be re-arranged, and the start position for the layer can be altered. All clips within a layer always run one after another. Doing things like moving the first clip causes the layer origin to be moved so the rest of the content isn't changed and so on, so it kind of 'just works' like you'd expect.

I have a basic single-item selection mechanism too, for deleting or other item-sensitive operations (e.g. where does 'add clip' add?).

So although the code is a complete pigs breakfast it is quite functional and stable, and with a bit of styling and frame graphics it's almost feature complete. Probably the last major functionality missing would be some start/end 'snapping' - but that's more of a nice-to-have.

Compositor too

Yesterday afternoon - when I was supposed to be relaxing - I worked on a prototype of a compositing and video creation engine. Initially I tried to use the ImageZ approach of compositing in floats, row-by-row, but android's java just isn't fast enough to make that practical. So I fell back to just using android's Canvas interface as the compositor. I guess it makes more sense anyway as it gives me a lot of functionality for free. I could get more performance bypassing it, but it's going to be a great deal of work. Obviously, I didn't attempt audio.

Initially I just wanted to see how performance was, but I ended up adding video and text "sources", which can be mixed/matched together. Performance is ... well it's ok, it's just a mobile platform with a purely CPU pipeline. A 10 second 800x600@25fps MP4 medium-quality clip takes about 20 seconds to decode+render+encode on the ainol elf 2. For the compositor, about half the time is spent in decoding the source material and half spent in compositing, which isn't exactly flash - but it's all bytebuffers and Bitmaps and copying shit around multiple times so it's hardly surprising.
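The compositing loop is then about as simple as it sounds (a hedged sketch; Layer and the encoder wrapper are illustrative names):

    // Render each layer into a Bitmap via Canvas, then hand it to the
    // encoder; Android does the clipping/blending/text for free.
    Canvas canvas = new Canvas(frame);
    for (Layer layer : layers)
        layer.draw(canvas, timestamp);
    encoder.encode(frame, timestamp);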

As usual the ideas in my head exploded yesterday and I thought about all sorts things: ways to create an interesting and usable interface, keyframe animation of clips and properties, multiple 'use modes' such as a simple video cutter, potential local client/server operation, using your pc to do the rendering, or using the tablet as an interactive control surface for a pc based application. But whether i'll have the time and motivation to see any of it through is another matter.

If i remember to, I'm going to try to track how much time i've spent on this project, so far it's about 3 days. I initially thought I could do a quick-and-dirty "weekend hack" and see how far I got and just leave it at that, and then thought the better of it. It's a 5-day long weekend for me so that's too much like hard work!

Friday 29 March 2013

jj .. jay?

Time to drop another ``prototype-bomb'' on the unsuspecting world ..

Ultimate goal is some sort of simple-to-use video editor, but to begin with there are a lot of user interface and algorithmic problems to work out. I tried to find some simple editors in the android store but they pretty much sucked - based on server processing, questionable advertising, difficult and confusing to use, and usually quite slow.

However, a few hours hacking so far and I have a fully functioning "clip" editor, which is one of the basic components required.

I've actually been meaning to play with something like this for quite a long time, and finally the planets aligned such that it was an opportune time to look into it (nothing to do with any internet goings ons). I will probably also dabble in a JavaFX version if I get anywhere with it; although the interface requirements there are very different.

Thursday 28 March 2013

jjmpeg android hardware decoding

I somewhat embarrassingly just discovered that FFmpeg already has some support for using libstagefright for hardware video decoding on Android.

I always thought this was the place it should go, since mucking around with OMX or other junk is just such a pain, and it belongs there.

Bit of a headfuck getting it to compile, particularly since this adds a C++ dependency, and then after all that ... it just crashes.

I/ffmpeg  (10484): Assertion i < avci->buffer_count failed at
   /home/notzed/svn/jjmpeg-1.0/jjmpeg-core/jni/ffmpeg-1.0/libavcodec/utils.c:603
F/libc    (10484): Fatal signal 11 (SIGSEGV) at 0xdeadbaad (code=1), thread 10607 (VideoDecoder)
I/DEBUG   ( 1894): *** *** *** *** *** *** *** *** *** *** *** *** *** *** *** ***
I/DEBUG   ( 1894): Build fingerprint: 'samsung/m0zs/m0:4.1.2/JZO54K/I9300ZSEMB1:user/release-keys'
I/DEBUG   ( 1894): pid: 10484, tid: 10607, name: VideoDecoder  >>> au.notzed.jjmpeg <<<
I/DEBUG   ( 1894): signal 11 (SIGSEGV), code 1 (SEGV_MAPERR), fault addr deadbaad
I/DEBUG   ( 1894):     r0 00000027  r1 deadbaad  r2 40126b0c  r3 00000000
I/DEBUG   ( 1894):     r4 00000000  r5 620f2a04  r6 60bfcc00  r7 61d0b8b0
I/DEBUG   ( 1894):     r8 5b127c80  r9 61bd5718  sl 000104d5  fp 61bd5718
I/DEBUG   ( 1894):     ip 5dc4bcbc  sp 620f2a00  lr 400f8c65  pc 400f52fe  cpsr 60000030
I/DEBUG   ( 1894):     d0  0000000200000000  d1  4b0000004b000021
I/DEBUG   ( 1894):     d2  000000094b000021  d3  0000000000000000
I/DEBUG   ( 1894):     d4  3ce5999061da5385  d5  3f34e1653f349a33
I/DEBUG   ( 1894):     d6  3f35287b3f356f75  d7  0106999ec0000000
I/DEBUG   ( 1894):     d8  0000000000000000  d9  0000000000000000
I/DEBUG   ( 1894):     d10 0000000000000000  d11 0000000000000000
I/DEBUG   ( 1894):     d12 0000000000000000  d13 0000000000000000
I/DEBUG   ( 1894):     d14 0000000000000000  d15 0000000000000000
I/DEBUG   ( 1894):     d16 8000000000000000  d17 ffffffffffffffff
I/DEBUG   ( 1894):     d18 416347d4c0000000  d19 3fe0000000000000
I/DEBUG   ( 1894):     d20 3fe0000000000870  d21 0000000000000000
I/DEBUG   ( 1894):     d22 0000000000000000  d23 0000000000000000
I/DEBUG   ( 1894):     d24 0000000000000000  d25 0000000000000000
I/DEBUG   ( 1894):     d26 0000000000000000  d27 0000000000000000
I/DEBUG   ( 1894):     d28 3f3504f3bf3504f3  d29 bf3504f33f3504f3
I/DEBUG   ( 1894):     d30 0000000000000000  d31 3f3504f33f3504f3
I/DEBUG   ( 1894):     scr 20000010

So yeah, i dunno. Perhaps it is a bug in jjplayer, but I tried removing all output handling, and it still just crashes inside codec->decode(), and the api to that point is so simple I don't think I can screw it up.

I've wasted half a day on this and it's losing its interest very fast ...

Update: So, just when I was about to give up, I found a threading bug in the codec implementation. This at least lets it decode more frames but it's still crashing later on inside the GLES library.

I haven't got the NV21 image format working properly yet, so it may be related to that, maybe.

Update: Well that was pretty much a full day down the drain, I submitted a patch to the ffmpeg-devel mailing list (yay, subscribe to a high volume list of zero interest to me just to submit a 10 line patch).

It turned out that the file I was testing against refuses to play at all in the bundled players, so it's something to do with the hardware/firmware. At best they load the first frame then abort, although they don't segfault (probably the parser/demux stops it going as far as the codec).

On a video recorded from the camera it seems to work ok, although it certainly doesn't appear particularly smooth - I would need to do some timings to see what's going on. I haven't written the NV21 support either. And the build stuff needs cleaning up and parameterising before I can check it in.

If it's going to still be so useless i'm not sure I really care to be honest. And I've had more than enough for today so whatever I decide will have to wait.

Monday 25 March 2013

#$#@$ Android

One of those "I can feel the hair going grey and falling out" mornings. Although at least it wasn't one of those "I made every mistake I could have, every time I did anything" one. My throat is hoarse from all the screaming of definitely un-pc obscenities.

I'm trying to get some jjmpeg stuff working on android - but I can't use the jjmpeg android branch due to licensing issues. I need to use a fully shared library version.

Compiling the code was easy enough but trying to use shared libraries on Android turned out to be a complete head-fuck.

First, for some fucked reason although shared libraries are placed into a per-architecture location, the bloody library path isn't set up to automatically use it. So you have to hard-code the architecture into the dlopen() calls.

Actually second, it's even worse than that, you need to use absolute paths for dlopen() otherwise it doesn't work. I just hard-coded it, a stack overflow answer stated that finding the path from Java was 'trivial', but then neglected to include which api trivially provided it. I didn't really want to change jjmpeg anyway.

And thirdly, its flaccid linker doesn't support versioning of the shared libraries. To fix that I hacked up ffmpeg/library.mak to only create non-versioned sonames, and then hacked a build script to make it compile properly, as with that change it didn't seem to want to build the libraries otherwise. I did this by specifying each lib*/*.so manually as the build target.

And the fun's only just started, i've still got a pile of C and NEON code I've got to get to work ...

Update: Oh fucking joy, no complex numbers now.

Update 2: Well I managed to replace complex.h with a few lines of #defines - fortunately complex numbers are in the compiler, just not in libm, and apart from the basic arithmetic I only needed creal/cimag and conjf.

And somehow, after re-arranging 1500 lines of C and 1500 lines of NEON assembly language, creating JNI wrappers, and hooking it into ffts and jjmpeg/head ... it worked almost first time. I'd set myself all week to get that far so i'm pretty chuffed, even if the headache from the day's earlier tribulations hasn't yet receded.

I really have the beagleboard and a GNU operating system to thank for that - without being able to develop on a proper system first it would've been a nightmare.

Time for a couple of glasses of nice Barossa red methinks.

Saturday 12 January 2013

A/V sync II

Had another go at a/v sync this morning. So far it's looking quite good although it needs a good clean-up from so many aborted experiments.

I managed to remove most of the synchronous code that was being used to try and coordinate everything and instead I'm using a centrally coordinated sequence number.

Every packet and frame created has a sequence number associated with it which is maintained through the data flow graph. Every time a seek operation is performed the sequence number is incremented. This feeds through to each stage and lets them make cheap decisions on what to do:

  • Decoders just flush if the sequence number changes.
  • Decoders discard any packets which have the wrong number.
  • Renderers discard any frames which have the wrong number.
  • Renderers reset any output they need to (i.e. audio flush, re-sync).

I also use similar code to JJMediaReader so that after a seek it discards any frames which come through before the desired seek position.
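Each stage then needs only a one-line check (an illustrative sketch, not the actual code):

    // Drop anything carrying a stale sequence number - it was produced
    // before the most recent seek.
    void queuePacket(AVPacket packet, int sequence) {
        if (sequence != currentSequence)
            return;
        decoder.offer(packet);
    }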

I still need some out-of-band callbacks for seek and pause because all of the above means the renderers may never see anything, but they don't need to do much.

The android player is somewhat broken so I need to fix that before committing anything.

Update: Got it working well enough and checked everything in. The new sync code causes some extra 'recovery time' after seeking, but once it's settled down it ends up with better sync. The recovery time is only noticeable on slow hardware.

Friday 11 January 2013

A/V sync

Had a bit of a limp stab at a/v sync today.

I started with something simple although it ended up a bit too complicated - trying to synchronise multiple decoding and display threads is a little messy. I was trying to hide as much as possible inside MediaReader and the Audio/VideoDecoder classes, but it got ugly.

But as far as timing goes, the simple approach nearly worked; there's just a couple of issues remaining:

  • After seeking on the container, the audio stream starts well before the video stream, so one ends up with a visual pause before starting, or even the video decoding never catching up (wrt packet buffering) and some nasty frame jumping.

    Hopefully this can be solved as I do in JJMediaReader: after a seek I discard packets and/or frames until the seek position is reached.

  • Getting the amount audio data "buffered" is a bit of a pain. Trying to use the reported position is fine until you seek since it throws the position out.

The simple approach was to have a central MediaClock object which tracks the audio renderer position, and then does various timing calculations for the video sync. It manages pause as well as interacts with seek. Maybe it isn't so simple ...
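Roughly (hypothetical names, not the actual classes):

    // The audio renderer publishes its position; video frames are
    // scheduled relative to it.
    class MediaClock {
        volatile long audioPositionMS;   // written by the audio renderer

        long videoDelayMS(long framePTSMS) {
            return framePTSMS - audioPositionMS;  // wait this long to show
        }
    }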

Eventually I should work it out.

Wednesday 9 January 2013

Preserving arbitrary Aspect Ratio in JavaFX

I had an attempt at displaying the proper aspect ratio in JavaFX, and after a couple of false-starts came up with a pretty simple solution. The ImageView does its own aspect ratio preservation but for a WritableImage the pixels are always square - as far as I can tell.

So one must adjust it outside. First I just used:

        vout.setScaleX(aspect);

To set the ratio - this displayed the video properly but didn't take the adjusted size into account during layout and fitting to the display area. Not really a show-stopper as the user can just adjust the window until it fits, but I was sure I could do better than that.

I tried various things such as placing the vout (an ImageView) into a Group and so on - but this didn't work (of course) as I was setting the dimensions of the ImageView relative to the window size.

Actually it turned out to be extremely simple: since I'm scaling the ImageView when it is being displayed, I just have to scale by the inverse when binding its dimensions:

        vout.fitWidthProperty().bind(root.widthProperty().divide(aspect));

The height is simply bound to the root.height.

And also remember to take it into account on the initial scene size too:

        Scene scene = new Scene(root, width * aspect, height);

So apart from calculating the aspect ratio, those were the only lines of code required. Seems to work although i've only tested it on one PAL 16:9 video so far ...

Although the screen capture stuff just stores the unscaled frame still ... which is probably what I want tbh.

Well not a bad haul today for a poor night's sleep and feeling a bit crap overall.

Update: Added some shots to show how it works in JJPlayer.

Window too narrow - automatically scaled to fit horizontally:

Window too wide - automatically scaled to fit vertically (and controls fading out):

A raw non-corrected frame grab showing the image using square pixels:

More video player work

Mucked around with a few things in JJPlayer last night and this morning. Nothing on TV last night and I just wanted a couple of hours today.

  • Tried a much more aggressive fallback mode for frame skipping - it starts to drop decoding of frames as well.

    I'm not sure how useful it is if the cpu is just too slow - it eventually turns the video into a irregular slideshow, but can let the sound play fairly smoothly if there's enough time left. So maybe it has some use.

  • I noticed the "get audio location" isn't very reliable in Android (at least), so I changed the way it syncs to audio. I think get audio location is just reporting the head of the last buffer submitted to the sound device or something similar, so it jumps around compared to the actual time. So now it only re-syncs if it gets well out of whack. It seems to remove some annoying frame rate jitter, but as i'm playing back 50Hz material on 60Hz it's still pretty jittery.

  • I only allocate 3x textures in the GLES backend if I don't need them to buffer. Less memory usage.

  • Moved some more functionality into the MediaReader class where it has the proper non-asynchronous state from which to make accurate decisions.

  • Added a full-screen mode in JavaFX (F11).

  • Bound the image size to the window size so it sizes to fit, maintaining aspect ratio. Still assumes square pixels.

  • Added keyboard controls in JavaFX. Space to pause/unpause, left/right to skip back/forward 30s, page up/down for start/end of file, escape exits full-screen mode or quits, and the afore-mentioned F11 for full-screen mode.

  • Changed to SVG icons in JavaFX, and even if they're a bit ugly they're consistent. Converting Inkscape SVG to code is a bit of a pain so it's a bunch of hard-coded stuff plus the curved triangle thing.

  • Did some styling to make the buttons flat/borderless, they pre-light.

  • Fixed some other seek related stuff in JavaFX frontend. It's better but still not perfect. Pause/resume still messes things up sometimes.

The new buttons and styling.

There were a few annoying things along the way. The keyboard input was a bit of a pain to get working, I had to override Slider.requestFocus() so it wouldn't grab keyboard focus when used via the mouse (despite not being in the focus traversal), and I added a 'glass-pane' over the top of the window to grab all keyboard events. I had to remove all buttons from the focus traversal group and only put the 'glass-pane' in it. I made the glass-pane mouse-transparent so the buttons still work.

Another strange thing was full-screen mode. Although JavaFX captures ESC to turn it off, the ESC key event still ends up coming through to my keyboard handler. i.e. I have to track the fullscreen state separately and quit only if it's pressed twice.

Next to fix is those a/v sync issues ...

Update: Well instead I added a frame capture function. Hit print-screen and it captures the currently displayed frame (raw RGB) and opens a pannable/zoomable image viewer. From here the image can be saved to a file.

Although I haven't implemented it, one could imagine adding options to automatically annotate it in various ways - timestamp, filename, and so on.

Update 2: I subsequently discovered "accelerators", so i've changed the code to use those instead of the 'glass pane' approach - a ton of pointless anonymous inner Runnables, but it feels less like a hack.

        scene.getAccelerators().put(new KeyCombination(...), ...);
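For example, the F11 binding comes out something like this (a hypothetical binding in the same style; the real ones differ in detail):

        // One anonymous Runnable per key binding.
        scene.getAccelerators().put(new KeyCodeCombination(KeyCode.F11),
            new Runnable() {
                public void run() {
                    stage.setFullScreen(!stage.isFullScreen());
                }
            });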

Update 3: I tried it on my laptop which is a bit dated and runs 32-bit fedora with a shitty Intel onboard GPU (i.e. worthless). It runs, but it's pretty inefficient - 2-4x higher load than mplayer on the same source. Partly due to the Java2D pipeline I guess but it's probably all the excess frame copying and memory usage. I tried to do some profiling on the workstation but didn't have much luck. It just seems to be spending most of its time in Gtk_MainLoop, although the profiler doesn't seem to know how to sort properly either.