Forgot to mention that I ported the game engine to Xbox 360 last week.
Encountered another issue similar to the ViewPort dimensions initialisation race-condition issue for iOS I blogged about previously. The fix was inspired by this discussion. The issue seems to date back to January.
The symptom is that touch events are scaled up (or down) because the touch panel dimensions do not match the actual screen dimensions.
I’m not sure this is 100% a bug: a touch panel can legitimately have more or less resolution than the screen, and I have encountered this behaviour when dealing directly with the hardware. The convenient solution is to default to the actual screen size.
I’ve resolved all of these race conditions by doing what I said I didn’t want to do and placed a call to my engine initialisation code in Game::LoadContent().
Some platforms seem to have the correct ViewPort after the call to Game::base.Initialize(), while others seem to only work it out by Game::LoadContent().
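A minimal sketch of the pattern, with EngineInit standing in for my real engine initialisation (the name and guard flag are illustrative, not my actual code):

```csharp
public class MyGame : Microsoft.Xna.Framework.Game
{
    GraphicsDeviceManager graphics;
    bool engineInitialised;

    public MyGame()
    {
        graphics = new GraphicsDeviceManager(this);
    }

    protected override void LoadContent()
    {
        base.LoadContent();

        // By LoadContent() every platform seems to report the real
        // viewport, so this is the safest place to initialise the engine.
        if (!engineInitialised)
        {
            EngineInit(GraphicsDevice.Viewport.Width,
                       GraphicsDevice.Viewport.Height);
            engineInitialised = true;
        }
    }

    // Hypothetical stand-in for engine setup that needs real dimensions.
    void EngineInit(int width, int height) { }
}
```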
My experience with MonoGame is getting to the point where I almost feel up to creating a patch for this and the other issues, but judging by the online discussions, minds greater and more experienced than mine are already onto this.
For now they remain a TODO, as I have features to create.
Here are some updated notes from my research last year into HEVC and UHDTV. I’ve aimed this at a general audience.
A new, more efficient video compression format is coming that will eventually replace the current workhorse, H.264/MPEG-4 AVC, used by online video services.
With HEVC/H.265, current video formats could require half the bit-rate to stream or, to think of it another way, download at twice the speed.
There is also a new digital video format and screen size on the horizon, UHDTV or ‘4K’.
‘4K’ is a confusing term because there are a lot of 4K formats, which is why, for television, UHDTV is technically the correct term. Marketing, however, will dictate whether the snappier ‘4K’ ends up being used instead.
| Format | Resolution | Display aspect ratio | Pixels |
| --- | --- | --- | --- |
| 4K Ultra high definition television | 3840 × 2160 | 1.78:1 | 8,294,400 |
| Digital Cinema Initiatives 4K (native resolution) | 4096 × 2160 | 1.90:1 (256:135) | 8,847,360 |
| DCI 4K (CinemaScope cropped) | 4096 × 1714 | 2.39:1 | 7,020,544 |
| DCI 4K (flat cropped) | 3996 × 2160 | 1.85:1 | 8,631,360 |
| Academy 4K (storage format) | 3656 × 2664 | 1.37:1 | 9,739,584 |
| Full aperture 4K (storage format) | 4096 × 3112 | 1.32:1 | 12,746,752 |
UHDTV is equivalent to four Full HD (1080p) screens.
Now, the amount of information you need to compress quadruples when you double the width and height: size × 2 × 2.
… but HEVC halves the file size relative to H.264: size ÷ 2.
So we get size × 2 × 2 ÷ 2 = size × 2.
So the UHDTV format compressed with HEVC will only end up being twice the size instead of four times the size.
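The same arithmetic as a trivial sketch (the 4× and ½ factors are the rules of thumb above, not measured figures):

```csharp
// Rule-of-thumb arithmetic: relative stream size, Full HD H.264 = 1.0
double fullHdH264 = 1.0;
double uhdtvH264 = fullHdH264 * 2 * 2; // double width and height: 4x pixels
double uhdtvHevc = uhdtvH264 / 2;      // HEVC roughly halves the bit-rate
System.Console.WriteLine(uhdtvHevc);   // 2: twice the size, not four times
```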
Let’s look at these two impacts in more detail.
This means that the delivery of current SD and HD video will be easier and cheaper.
Video will consume less bandwidth and less space on CDNs.
This will allow IPTV services to penetrate deeper into existing markets and enter markets where the network was previously too slow.
In other words, less buffering and speedier downloads for those currently capable while those with borderline network speeds will be able to sustain a stream, where before they could not.
It takes time for the mainstream industry to move, so I’d bet the first place we see the impact of this technology will be illegal downloads (and yes, porn [SFW]), which will have file sizes made smaller, or kept the same but delivered at higher quality.
This means it’s more important than ever that content providers and distributors set the price-to-effort ratio for legitimate vs illegal content acquisition correctly.
This will accelerate the practical benefit of UHDTV enabled devices.
As for seeing UHDTV on Cable or Free To Air services, I’m going to go out on a limb here and suggest that…
I don’t think we will see UHDTV in a commercially meaningful way on Cable or Free To Air broadcasts until the last quarter of this decade, at the earliest.
Plug a PC with a ‘4K’ video card into a ‘4K’ Monitor and you have a workable system.
We are long past online video being ‘postage stamp’ sized. Online video is now at least as good a quality as what you can get on Free To Air and Cable Television.
UHDTV combined with HEVC means that for the first time, online video will be of a better quality than Free To Air and Cable Television.
I could imagine a Netflix ad campaign for an HEVC + UHDTV service along the lines of the old Trinitron ads: “You are only getting a quarter of the picture, watch TV on Netflix.”
Once IPTV set-top boxes are capable of HEVC and UHDTV, then “HDIPTV” could go mass market.
Assuming anybody cares.
Just because we can do 4K does not mean we need it. The end result could be that our videos just get smaller, and 4K screens are a flop that nobody wants.
Personally I think the age-old advertising lever of “better quality” will win out. Unlike 3D or ‘Connected’, 4K will make sense to consumers as an easy-to-understand and tangible quality intrinsic to the screen. 4K has a hyper-real quality that makes a big impression. It will seem a worthwhile purchase.
Let me know what you think. Evidence to the contrary or supportive is more than welcome.
If HEVC delivers on its promise, it’s good for online video services.
If UHDTV screens reach mass market, unless you live in Japan, online video delivery services have a good chance of beating and outclassing Free To Air and Cable.
It has already begun.
Exciting times ahead.
It was a bit of a struggle as I had to create a different pipeline solution for iOS assets. I’m now giving all my projects a separate content pipeline… for great symmetry!
The Makerbot 2 is widely acknowledged by the developer community to have a faulty design in a key component.
I ordered a print of the replacement from the guy who designed it. I could have printed and assembled it myself, but paid for it as a form of reward/encouragement.
Then I put it in a box for 5 months.
— James McParlane (@DrMiaow) July 27, 2013
If I ever got into the situation where I could no longer print because the original faulty part had failed, I could fit the replacement and carry on.
This weekend, that happened 🙂
Failed print. … Now printing the extruder upgrade… https://t.co/kTjMozDy2D
— James McParlane (@DrMiaow) July 28, 2013
Then I downloaded the latest revision of the fix and printed that 🙂
— James McParlane (@DrMiaow) July 28, 2013
It seems it’s impossible to know, in a cross-platform manner, what your rendering surface size is actually going to be.
In XNA, in Initialize() you can query the ViewPort dimensions and they will be correct.
It seems that on many other targets you don’t really know this for sure. Most likely it will return 800×480 until you get to LoadContent().
On some devices what you ask for is not what you get.
Some consistency would be nice.
In the meantime I’ve had to give up on semantics and move all my Initialize() code that needs to know the Viewport size (which is basically all of it) out into LoadContent().
Also, beware of the current Xamarin Studio on the Mac: it does not copy and paste properly if you have Windows CRLF line endings in your files, and will mess up the end of your paste.
Let’s face it – the Android emulator almost always sucks. It’s perhaps the most broken bit of an otherwise OK development process for Android.
Nothing beats the fidelity of testing on the real hardware but you need to jump through a few hoops first.
For MonoGame with the Visual Studio plugin you can launch directly into the hardware, but there are two gotchas.
When you run adb devices from the command line and it responds with an empty list…
…and your device is plugged in, then you will probably need to update the USB driver to be the Android SDK ADB one supplied by Google.
It should now appear as an Android Device with an ADB Interface.
Running adb devices from the command line shows my OUYA console.
In Visual Studio 2012, if you can’t see your device, try restarting Visual Studio or rebooting.
Your device should then be an option when you F5, Run/Debug, or Deploy from Visual Studio.
Out-of-the-box performance for just drawing text, rendering images and playing audio is fine, but my unoptimised hexagon vector engine that runs at 60fps on my PC won’t get above 10fps on the OUYA.
One thing I might try is using a scaled-down 720p render target and blitting up to the 1080p of the OUYA.
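A sketch of that render-target idea, untested on the OUYA (spriteBatch and sceneTarget are assumed fields of the Game class):

```csharp
RenderTarget2D sceneTarget;

protected override void LoadContent()
{
    spriteBatch = new SpriteBatch(GraphicsDevice);
    // The scene is rendered at 720p...
    sceneTarget = new RenderTarget2D(GraphicsDevice, 1280, 720);
}

protected override void Draw(GameTime gameTime)
{
    GraphicsDevice.SetRenderTarget(sceneTarget);
    GraphicsDevice.Clear(Color.Black);
    // ... draw the vector scene here at 1280x720 ...

    // ... then blit it up to the 1080p back-buffer.
    GraphicsDevice.SetRenderTarget(null);
    spriteBatch.Begin(SpriteSortMode.Deferred, BlendState.Opaque,
                      SamplerState.LinearClamp, null, null);
    spriteBatch.Draw(sceneTarget, new Rectangle(0, 0, 1920, 1080), Color.White);
    spriteBatch.End();

    base.Draw(gameTime);
}
```

Whether the linear-filtered upscale looks acceptable on a TV is something I’d have to judge by eye.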
There seems to be a workflow issue at the startup of a MonoGame app in Windows using OpenGL.
The solution seems to be to perform your own initialisation on the viewport.
The issue is that during Initialize() your app seems convinced it has an 800×480 display, no matter what you have asked for in your back-buffer.
This is not the behaviour in XNA, so I’m declaring it broken in MonoGame OpenGL.
When you set your preferred back-buffer dimensions to something like 720p…

```csharp
graphics.PreferredBackBufferWidth = 1280;
graphics.PreferredBackBufferHeight = 720;
```

… the window might be that size, but what renders for the first few seconds is an 800×480 rectangle in the corner.
My fix is to manually create the viewport, since ApplyChanges() seems to do nothing immediately. My assumption is that there is a WM_RESIZE message pending that will, asynchronously at some point, update the viewport.
At some point after Initialize(), when the core Update() pump fires up, the app seems to recognise that it has a 720p display.
Explicitly creating a new viewport is the real magic and resolves it for me. This code crashes in XNA, hence the #ifdef for MONOGAME.
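Roughly, the fix looks like this (a sketch of my Initialize() override; the MONOGAME symbol is one I define myself in the project settings, not something the framework provides):

```csharp
protected override void Initialize()
{
    graphics.PreferredBackBufferWidth = 1280;
    graphics.PreferredBackBufferHeight = 720;
    graphics.ApplyChanges(); // seems to take effect asynchronously

#if MONOGAME
    // Explicitly create the viewport at the size we asked for, so any
    // Initialize() code that queries it gets the right answer.
    // Doing this under XNA crashes, hence the conditional compile.
    GraphicsDevice.Viewport =
        new Viewport(0, 0,
                     graphics.PreferredBackBufferWidth,
                     graphics.PreferredBackBufferHeight);
#endif

    base.Initialize();
}
```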
I still get the white rectangle for a fraction of a second at the start, but any code in Initialize() that asks for the viewport size gets the right size.