Friday, September 23, 2005

To stay alive on google

Who wouldn't want to be listed in Google's directory? Although I had read about this on numerous "increasing your Google rank" websites, I was still surprised to discover that your rank depends significantly on how often you update your content. However, I still believe that the top criterion is the number of links pointing to your page.

I have noticed that my blog has been losing rank in the past few days, since I stopped publishing content. Then again, when a page that publishes frequently battles for rank against a page with a hundred links pointing to it across the web, the one with the links wins (my theory). Some people have managed to game Google; I noticed this while searching for something I don't really remember (try searching for a famous celebrity's pics). You get search results linking to pages regardless of whether they have the content you are searching for, i.e. they have somehow fooled the Google crawler.

Queen Mary's Information Retrieval Group does a lot of work in "search" research. The little knowledge I have of information retrieval comes from the chit-chat sessions I used to have with my fellow MSc classmates who took the information retrieval course.

Meanwhile, in the news: Sony is planning to cut 10,000 jobs by 2008. My theory is that people still "love" Sony, but they just can't afford it anymore. This is on top of the fact that Sony has lost popularity in the mp3 player business (to the Apple iPod) and in the LCD TV market.

Monday, September 12, 2005

XSLT + XML + CSS

Blogging continues to shrink the number of personal home-page designers these days. People simply prefer to "blog" (which was recently coined as an English verb). Back in the day, when everyone was out there trying to put up a homepage on the internet, only a very few would frequently update their pages with new content. If you were one of them, you certainly experienced the pain of updating content. HTML templates came along the way, but the extent to which they saved HTML design time was limited. New content still needed to be formatted, positioned, etc.

Web design and coding soon started becoming more "structured" with the emergence of CSS. Since web code carried such enormous amounts of font markup, a CSS-like construct was an obvious next step. I have also been dying to add a comment on JavaScript, one of only two scripting tools (the other is VBScript) that have been around since 1995. I feel JavaScript was designed without much thought for future compatibility. It lacks intelligent abstraction. It's a "loose" language and lets you do things in ways you wouldn't expect other languages to.

Going back to blogging. It's really a two-step process: compose and publish. Recently, I have been thinking of implementing a blog-like architecture for updating my homepage. Of the many possible solutions, the simplest I could think of is using XML to carry the content, and XSLT to carry out the transformation required to convert the XML to HTML. I am also planning to develop an application in C# which simply converts my composed text to XML format. The application would look similar to the "compose" box of email accounts. CSS can also be nicely integrated into XSLT, and different CSS fonts could be selected by the user from a drop-down box in the composer application during content composition.
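The compose step could be sketched roughly like this (in Python rather than C#, purely for illustration; the `post`/`title`/`body` element names and the `font` attribute are my own made-up schema, not anything final):

```python
# Sketch of the "composer" idea: take composed text plus a chosen CSS font
# and serialize it as an XML post that an XSLT stylesheet could later
# transform into HTML. Element names here are an assumed, made-up schema.
import xml.etree.ElementTree as ET

def compose_post(title, body_text, font="Verdana"):
    post = ET.Element("post", attrib={"font": font})
    ET.SubElement(post, "title").text = title
    ET.SubElement(post, "body").text = body_text
    return ET.tostring(post, encoding="unicode")

xml_post = compose_post("XSLT + XML + CSS", "Blogging is a two-step process...")
print(xml_post)
```

The drop-down in the composer would just change the `font` value; everything about how that font is rendered stays out of the content and inside the stylesheet.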

I am pretty sure this is how news websites update their content. They simply want to blog all their content onto the webpage and avoid the hassle of redesigning. I am thinking more along the lines of a class of software which can serve as an interface to your webpage. For now, I will experiment with making the composer application for my homepage, and if all goes well, I will have developed a small blogger for my homepage.
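The publish step would then be a single transformation. Since I haven't written the XSLT yet, here is a stand-in sketch in Python (using the same made-up `post` schema as above; the real version would be an XSLT stylesheet doing the same thing with templates) that turns such a post into an HTML fragment:

```python
# Stand-in for the XSLT publish step: turn an XML post into an HTML
# fragment, applying the chosen font via an inline CSS style.
# In the real design this logic would live in an XSLT stylesheet.
import xml.etree.ElementTree as ET

def publish(xml_post):
    post = ET.fromstring(xml_post)
    font = post.get("font", "serif")
    title = post.findtext("title", "")
    body = post.findtext("body", "")
    return ('<div class="post" style="font-family: {}">'
            "<h2>{}</h2><p>{}</p></div>").format(font, title, body)

html = publish('<post font="Georgia"><title>Hello</title>'
               "<body>First post.</body></post>")
print(html)
```

The point is that publishing never touches the page design: new content flows through the same transformation every time, which is exactly the hassle-avoidance I am after.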

Friday, September 02, 2005

Cardiovascularity ...

Atrial fibrillation is a disorder found in almost 2.2 million Americans. It is an abnormality in which the atria of the heart are unable to pump all of the blood out of the heart. The pool of blood that remains may clot, and a stroke could result if such a clot reaches an artery of the brain. 15% of strokes are caused by atrial fibrillation. Catheter ablation is a treatment for this disorder, and is normally used when drug therapy is no longer effective.

Catheter ablation is a very difficult and intricate process. A very small device is inserted through the pulmonary artery of the heart. The pulmonary artery is usually accessed through the groin region, since it's relatively less hazardous to operate in that area. Once inserted, the device travels all the way up to the muscles surrounding the heart. The device has a special heater which can apply heat to tissues, killing them instantly. If we can successfully kill the tissues causing atrial fibrillation, we can eliminate the disorder in the patient. However, the most difficult part is pointing our "heater gun" accurately at these tissues.

People have been thinking of using machine vision techniques to assist catheter ablation. Using Magnetic Resonance Imaging (MRI) together with endoscopy, we should be able to produce a detailed 3D model of the heart, and especially of the muscles surrounding it. Until December, I intend to look at ways of doing this. The first step would be segmenting the data-sets and locating the different clusters of data which represent different parts of the heart. Producing a 3D model after identifying the clusters should be a fairly trivial task; we could use tools such as vtk. Using our 3D model, we can explore the muscle tissues as much as we want. It forms the basis of a virtual endoscopy. It is like our test ground, where we get to take as many shots as we want. We cannot falter when we take that one shot at the real muscle.
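That segmentation step could start as simply as thresholding the scan intensities and grouping connected voxels into clusters. A toy sketch of that clustering idea (pure Python on an invented 3x3x3 intensity volume; real MRI data would of course need proper tooling like vtk):

```python
# Toy segmentation sketch: threshold a small 3D intensity volume, then
# label connected clusters of bright voxels with a breadth-first search.
# The volume and threshold values are invented for illustration only.
from collections import deque

def segment(volume, threshold):
    """Return a dict mapping cluster id -> list of (x, y, z) voxels."""
    nx, ny, nz = len(volume), len(volume[0]), len(volume[0][0])
    seen = set()
    clusters = {}
    next_id = 0
    for x in range(nx):
        for y in range(ny):
            for z in range(nz):
                if (x, y, z) in seen or volume[x][y][z] < threshold:
                    continue
                # Grow a new cluster outwards from this seed voxel.
                queue = deque([(x, y, z)])
                seen.add((x, y, z))
                voxels = []
                while queue:
                    cx, cy, cz = queue.popleft()
                    voxels.append((cx, cy, cz))
                    for dx, dy, dz in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                                       (0, -1, 0), (0, 0, 1), (0, 0, -1)):
                        n = (cx + dx, cy + dy, cz + dz)
                        if (0 <= n[0] < nx and 0 <= n[1] < ny
                                and 0 <= n[2] < nz and n not in seen
                                and volume[n[0]][n[1]][n[2]] >= threshold):
                            seen.add(n)
                            queue.append(n)
                clusters[next_id] = voxels
                next_id += 1
    return clusters

# Two bright blobs in an otherwise dark 3x3x3 volume.
vol = [[[0] * 3 for _ in range(3)] for _ in range(3)]
vol[0][0][0] = vol[0][0][1] = 9   # blob one: two adjacent voxels
vol[2][2][2] = 9                  # blob two: a lone voxel
print(len(segment(vol, 5)))       # two separate clusters found
```

Each cluster would then become a candidate structure (chamber wall, muscle, etc.) to hand off to the 3D modelling step.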