P2P and Music Trends

“Gracenote Music Map” – a Flash application that visually displays information garnered from requests to the Gracenote online database of CDs.

http://www.gracenote.com/map/

“Study Finds That P2P Users Not Less Likely To Buy CDs” – A new study commissioned by Industry Canada, which includes some of the most extensive surveying to date of the Canadian population on music purchasing habits, finds what many have long suspected (though CRIA has denied): there is a positive correlation between peer-to-peer downloading and CD purchasing.

http://www.michaelgeist.ca/content/view/2347/125/

 

Posted in Music & Art | Leave a comment

MetaWrap JavaScript Library Back Up

There was a bad cable when we moved the servers around. All is good now.

Thanks to the people who pointed this out; unfortunately I was so busy I didn’t get to attend to it for a week.

Now the bad news. Everything is going down again this weekend due to a major change to the power in the building – the good news is that this is the last of the major infrastructure changes and things should then settle back down again.

After that I should have some really cool stuff to start demoing.

Posted in Downtime, JavaScript | Leave a comment

Still Alive

Just very very very busy…

Posted in Uncategorized | 1 Comment

Post Google Online Conversations – "Googpartee"

There should be a word for a conversation held online in which both parties are Googling what the other is saying. It kind of enters a hyperspace of profound references – at times you are digging deep into a topic, researching because… you can… you have Google at your fingertips and you have the inclination; at other moments you are jousting with obscure references – laying clues to see if they can keep up.

I shall name you “Googpartee”.

Should be an Olympic sport.

Posted in Music & Art | 1 Comment

Massive Is Renovating

The blog and JavaScript server was down for a few days. All should be good now.

Posted in Downtime | Leave a comment

Another Windows Socket Limitation Fix

I have a secret project that I’m putting through non-functional testing at the moment. It’s FTPing around 600,000 files in the space of a few hours – which means it’s hitting the 4000-sockets-in-4-minutes limitation in Windows.

File #4902 System.Net.Sockets.SocketException: 
Only one usage of each socket address (protocol/network address/port) is normally permitted

I don’t want to have to run my application at about 1% of its available capacity.

Luckily the solution corresponds to the usual registry tweaks used to make Windows scale.

  1. Ensure that all sockets use “keep-alive” (so Windows can work out faster whether the socket is closed)
  2. Push HKLM\System\CurrentControlSet\Services\Tcpip\Parameters\MaxUserPort to 65534
  3. Reduce the time wait HKLM\System\CurrentControlSet\Services\Tcpip\Parameters\TcpTimedWaitDelay to 30 seconds
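Steps 2 and 3 can be captured in a .reg file so the tweak is repeatable across test machines. This is only a sketch – both values are DWORDs under the Tcpip\Parameters key, a reboot is needed for them to take effect, and you should back up the registry first:

```reg
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters]
; Raise the upper bound of the ephemeral port range (default is 5000)
"MaxUserPort"=dword:0000fffe
; Shorten TIME_WAIT from the 240-second default to 30 seconds
"TcpTimedWaitDelay"=dword:0000001e
```

0xfffe is 65534 and 0x1e is 30, matching the values above.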

This means I can FTP 120,000 files in one minute – which is much more like it.

Update #1

This seems to be part of the puzzle too.

And this and this

Update #2

8/6/2007 7:49:38 PM LOG Made 32450 in 22 minutes (927 connections per second)

With all of the above tweaks I managed this on a Windows XP machine – it threw no “Only one usage” errors.

Posted in C# | Leave a comment

Another article on the demise of the old EBOM Warehouse @ 144 Cleveland St (Lanfranchi's)

“One of the Keating! musicians lived in a warehouse in Sydney’s Cleveland Street known as Lanfranchi’s – so called because the laneway behind it was where, in March 1981, Roger Rogerson shot Warren Lanfranchi in that memorable scene from Blue Murder – another very Sydney story.

Lanfranchi’s was a great labyrinth of rooms that had been performance venues, recording studios, installation spaces, a cinema and living quarters for several generations of Australian musicians. On June 25 the scores of young musos living, working and developing in Lanfranchi’s were finally evicted.

Four weeks ago Lanfranchi’s was one major strand of our city’s cultural DNA. Cut off. This is a city where pubs, the traditional live music venues, are pulling the plug more and more to make room for the pokies spreading like a blight across our city.”

http://www.smh.com.au/news/opinion/we-need-a-place-to-breed-our-cultural-dna/2007/07/27/1185339252792.html

 

Posted in Me Myself and I, Music & Art, Nostalgia for Misspent Youth | 1 Comment

Fast Loading Of XHTML as XML In JavaScript Using Msxml2.DOMDocument.*

This solved the issue I was having with MSXML refusing to parse XHTML as XML when it lacked an xml-declaration – which is the only format that Microsoft Expression Web will auto-detect as XHTML without being forced to fail over to it ([menu] Tools > Page Editor Options > Authoring > Secondary Schema = XHTML 1.0 Strict).

I coded up a simple test case.

Normally for all things XML in JavaScript I have a handy lib that abstracts this away for me, but for a test case I’m staying close to the metal.

var l_xml_document = new ActiveXObject("Msxml2.DOMDocument.3.0");
l_xml_document.async = false;
l_xml_document.validateOnParse = false;
l_xml_document.resolveExternals = false;
l_xml_document.load(p_file);

I’ll run through each line.

var l_xml_document = new ActiveXObject("Msxml2.DOMDocument.3.0");

This creates an object which is IE’s version of the W3C standard DOM Document. I ask for the MSXML 3.0 DOMDocument by name because it’s the default for “Msxml2.DOMDocument”. This means I’m not going to get any surprises if the default goes to v6.0 and the behavior changes. V3 seems to be installed on all Windows machines by default; from memory it came as part of IE 5.5.

l_xml_document.async = false;

This disables asynchronous mode – in this case I’m happy to wait for the file to arrive.

l_xml_document.validateOnParse = false;

This was the clincher – without setting validateOnParse to false I was unable to parse XHTML in the following format (which is valid XML – the xml-declaration is optional according to the W3C spec):

<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xmlns="http://www.w3.org/1999/xhtml" lang="en"><head></head><body></body></html>

Otherwise I needed to add the xml-declaration at the start:

<?xml version="1.0" ?>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xmlns="http://www.w3.org/1999/xhtml" lang="en"><head></head><body></body></html>

When I did add the xml-declaration I had to force Microsoft Expression Web to fail over to treating this as XHTML; it would not detect it.

Using either the xml-declaration or the validateOnParse-set-to-false workaround, DOMDocument was very slow to load the XML, which brings us to the next line.

l_xml_document.resolveExternals = false;

This prevents Msxml2.DOMDocument from trying to load any DTDs and validate your code. This can take 5 seconds per request – so you really don’t want it unless there is a real danger that the XML you get back will not conform to XHTML.

l_xml_document.load(p_file);

Load it. Not explaining this.
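One caveat worth a few lines: load() does not throw on a parse failure – it returns false and fills in the parseError object, so a silent failure is easy to miss. Here is a hedged sketch of the same sequence wrapped with error reporting (loadXhtmlAsXml is my own name, and the ActiveXObject guard is only there so the function degrades gracefully outside IE):

```javascript
// Sketch: load a file as XML via MSXML 3.0 and surface parse errors.
// Assumes IE/WSH on Windows; parseError is the MSXML object that
// load() populates instead of throwing on malformed input.
function loadXhtmlAsXml(p_file) {
    if (typeof ActiveXObject === "undefined") {
        return null; // not IE/WSH - MSXML is not available here
    }
    var l_xml_document = new ActiveXObject("Msxml2.DOMDocument.3.0");
    l_xml_document.async = false;
    l_xml_document.validateOnParse = false;
    l_xml_document.resolveExternals = false;
    if (!l_xml_document.load(p_file) ||
        l_xml_document.parseError.errorCode !== 0) {
        throw new Error("XML parse failed at line " +
            l_xml_document.parseError.line + ": " +
            l_xml_document.parseError.reason);
    }
    return l_xml_document;
}
```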

Hope this helps someone. Special thanks to John-Daniel Trask and Chris Bentley who commented on my original post.

Posted in JavaScript, XML | 1 Comment

Parsing XHTML and XML vs IE and Microsoft Expression Web

I have a project where I want to treat XHTML as XML, which should be trivial because XHTML is XML – and it is trivial with Firefox.

I have been experimenting with Microsoft Expression Web as my primary HTML/CSS editor – simply because of its awesome CSS refactoring ability.

However, I have run into an issue.

For XHTML to be parseable as XML with the browser XmlDocument/Msxml2.DOMDocument object, I need the XML to be well formed. From experiment it looks like with IE I need to include the xml-declaration <?xml version="1.0" … ?> at the start, before the DOCTYPE or any processing instruction, or the document does not parse.

According to the W3C XML spec the xml-declaration should be optional, and if this worked in practice I would have no issue – but it seems the IE Msxml2.DOMDocument object will not load XML if you don’t have the xml-declaration before the DOCTYPE.

So I am forced to add it. And according to the XHTML spec that’s not only fine, it’s recommended.

[image: w3c_xhtml.png]

http://www.w3.org/TR/xhtml1/#docconf

But if I do that, in Microsoft Expression Web I get the following:

[image: e4w_forces_nonxml.png]

It’s telling me I need to make my XHTML badly formed! I can only assume that, despite the perfectly legal xml-declaration, it’s not recognising the DOCTYPE.

Could it be that the code to detect the DOCTYPE is not allowing for the xml-declaration, as with a previous issue in IE6?

Changing it to

<meta http-equiv="Content-Type" content="text/html; charset=utf-8" ></meta>

That just seems to bring on another error.

It also forces me into IE6 rendering mode (which I suspect is a kind of quirks mode).

In that mode CSS does not work as expected.

If I remove the xml-declaration <?xml version="1.0" encoding="UTF-8"?> then it works – but my XHTML is now not XML (according to Msxml2.DOMDocument) and I can’t process it in any of the browser XMLDocument objects without getting errors.

So I can’t use Expression Web to edit any of my XHTML.

If I can’t find a fix for this (or an explanation of what I am doing wrong), I’m going to have to revert to using Firefox and Firebug for all my CSS work.

I’m going to attack this from the angle of getting the Msxml2.DOMDocument in IE to parse the XML with the DOCTYPE at the start (no xml-declaration) – perhaps there is a workaround there.

It seems to work fine in Firefox.

Perplexed.

 

Posted in XML | 7 Comments

The Right Of Reply

The awaited great streaming inevitable has arrived in the form of remixes of the original YouTube post “by” John Howard.

What has been enlightening, shocking and surprising is the flurry of videos that have been appropriating the metadata from the original video, such that if you search for the original, your first choices are now half a dozen 30-second political videos which appear on the surface to be from other political groups.

This is a prime example of how SEO-Jamming can divert and subvert a campaign.

Here is a remix…

And another

And a reply in the tradition of YouTube

And here is one in the form of a party political announcement

This was by far the most entertaining

I can’t help but think that this attempt by John Howard to “connect” with people via YouTube has been soundly thwarted by people who either anticipated this move, recycled what they had lying around, and/or got their a(r|c)t together and firmly culture-jammed the original post into oblivion.

 

Of course we need some Clarke and Dawe from earlier in the year.

Posted in Politics, Web2.0 | 1 Comment