Information Age in 2014!

February 28th, 2006

An amazing video on the Information Age in 2014 is available here!

It's an approximately eight-minute video explaining the evolution and spread of the Google religion and the clash of the Titans: Amazon, Google and Microsoft.

With a grid woven from Blogger, Friendster, Amazon and Google, we are moving toward a highly personalized global information system of news, views and reviews.

Can we have a brainstorming session in one of our meetings on what each of us thinks the information age of 2014 will look like?

Not your usual Swoogle

February 26th, 2006

When we named our semantic web search engine Swoogle, we registered the .org domain but none of the other Swoogle top-level domains. A .org address seemed appropriate since we had no plans to commercialize Swoogle and wanted to use it as the canonical URL for branding reasons. Well, we should have spent the extra $10 to secure the others. A Google Alert just informed me of a new web site devoted to “Adult clothing galleries for women’s erotic day & evening clothes, sexy bikinis & swimwear, erotic lingerie fashions, and sexy shoes & boots”.

Why Swoogle? Well, their official name seems to be “Sexy Women’s Online Galleries for Lingere Etc”.

We live in a fallen world.

Anybody know a good trademark lawyer?

Everyone needs this ontology

February 24th, 2006

“Everyone needs this Ontology”. The subject flashed itself in my mail client. No sooner had it arrived than it was gone. “Everyone needs this Ontology”? What could it be and where did it go? Could the spam filter have eaten it? I looked in my spam bucket and there it was.

Date: Sat, 25 Feb 2006 06:15:21 +0500
Message-Id: <CFE0.9A2.3081698B8C@>
From: “Jennifer Mansfield”
To: ontology@cs.umbc.edu
Subject: Everyone Need This Ontology

Huge selection of meds available at attractive prices. Highest quality assured. Try us out today.. http://

Spire: Semantic Web for Ecoinformatics

February 24th, 2006

This presentation describes the SPIRE project, a distributed, interdisciplinary research project that is exploring how the Semantic Web can be used in Ecoinformatics. Several tools are described including the Swoogle Semantic Web search engine and ELVIS (the Ecosystem Location Visualization and Information System), a suite of tools for constructing food webs for a given location. The presentation shows how the SPIRE tools are used to answer queries against multiple Semantic Web documents in the course of building a food web model.


ACM Information Technology job offshoring study

February 24th, 2006

ACM published a study on the globalization and offshoring of software with several findings:

  • Globalization and offshoring of the software industry are deeply connected and both will continue to grow.
  • Offshoring can, as a whole, benefit both developed and developing countries, but competition is intensifying.
  • Offshoring will increase but determining the specifics is difficult. Skepticism is warranted regarding claims about the number of jobs to be offshored and the projected growth of software industries in developing nations.
  • Standardized jobs are more easily moved to developing countries than are higher-skill jobs. While these standardized jobs were the initial focus of offshoring, global competition in higher-end skills, such as research, is increasing today.
  • Offshoring magnifies existing risks and creates new and poorly understood threats to national security, business property and processes, and individuals’ privacy.
  • To stay competitive in a global IT environment and industry, countries must adopt policies that foster innovation.

The study concluded that predictions of job losses were greatly exaggerated and that 2-3% of US IT jobs will go offshore annually over the next decade. More jobs will be created than are lost, however, as long as US industry moves up the economic ladder to do higher-value work, typically applying IT to other fields like biology and business. Employment in the IT industry is higher today than it was at the peak of the dot-com bubble, despite the growth of offshore outsourcing in the last few years.

(spotted on the CRA policy blog)

ITtalks semantic web application demo (c. 2001)

February 23rd, 2006

Semantic Web research at UMBC began in 2000 when we (UMBC, JHU/APL and MIT/Sloan) were awarded a research contract under the DARPA DAML program. One theme of our award was to investigate the integration of software agents and DAML. That winter we began developing a research testbed application: a portal with information about information technology talks. The result was the ITTALKS system, which you can see in this demo done in June 2001. We recently found it gathering dust on one of our servers and uploaded it to Google Video, partly for posterity and partly to give Google Video a try.


Upper Ontology Summit

An Upper Ontology Summit will be held in Gaithersburg MD on 15 March 2006 to work toward developing a common core ontology to support knowledge interoperability.

There are a number of open, large, general-purpose ontologies in existence today, including SUMO, the OpenCyc Upper Ontology, DOLCE, PSL, OntoSem, WordNet and others developed in the Semantic Web community and in application areas such as biomedical research. Such ontologies, often called upper ontologies or foundation ontologies, describe general concepts and entities that do not belong to a specific problem domain. General concepts describing time, space and social roles, for example, are common in upper ontologies; vocabulary describing VLSI layouts is not. Having widely used and shareable upper ontologies is thought by many to be critical for knowledge sharing and interoperability. The current thinking is not so much that all domain ontologies must share the same upper ontology (although that would be very convenient) but that upper ontologies can be used as common references for ontology mapping and alignment.

The Upper Ontology Summit will be held at NIST on 15 March 2006, sponsored by the Ontolog Forum, NIST, and SICoP. Its goal is to get the “custodians” of the leading existing upper ontologies to agree on a subset, an intersection of these ontologies, that could serve as a core to support interoperability among the explosively growing population of ontologies. Representatives and stewards of many well-known large, general-purpose ontologies will be there, including Cyc, SUMO, DOLCE, BFO and PSL. The one-day workshop and panel discussion will take place on the NIST campus in Gaithersburg, MD. The workshop is free, but pre-registration by 6 March is required to enter the facility.

Networked RFID systems

February 20th, 2006

The International Telecommunication Union (ITU) recently held a two-day workshop on Networked RFID: Systems and Services in Switzerland. There are some useful and interesting presentations available on the program page, including audio feeds. The workshop focused on the use of RFID technology in networked environments and the current and future applications, services and business models leveraging networked RFIDs (NRFIDs). (spotted on Smart Mobs)

WordNet in RDF/OWL, Best Practice

February 20th, 2006

WordNet has always been an interesting source of knowledge. It began in the mid 80s as a project by Princeton’s George Miller with the aim of exploring new, computer-mediated models of dictionaries. I recall hearing him talk about this in 1984 or 1985.

WordNet is what might be called a lexical ontology, i.e., one focused on words and their meanings. It currently consists of about 150K words associated with 115K synsets (word senses or meanings) and is widely used as a resource for AI, NLP and IR projects. WordNet has been mapped into RDF many times and, because of its broad coverage, has been used in many ways, especially for annotating objects and for ontology mapping. Using WordNet senses for tagging, for example, can improve accuracy and reduce ambiguity.

The WordNet Task Force of the SWBPD WG has released an editor’s draft of RDF/OWL Representation of WordNet. It provides a standard conversion of WordNet for direct use by Semantic Web application developers. “By providing a standard conversion that is as complete as possible the TF aims to improve interoperability of SW applications that use WordNet and simplify the choice between the existing RDF/OWL versions.” (spotted on SWIG scratchpad)
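The shape of such a conversion can be illustrated with a toy example that emits N-Triples for a single synset. The namespace and property names below are invented placeholders for illustration, not the identifiers actually used by the task force’s conversion:

```python
# Sketch: emitting N-Triples for a WordNet synset.
# The URIs below are illustrative placeholders, not the actual
# identifiers defined by the W3C WordNet Task Force draft.

WN = "http://example.org/wordnet/"  # hypothetical namespace
RDF_TYPE = "http://www.w3.org/1999/02/22-rdf-syntax-ns#type"

def synset_triples(synset_id, words, gloss):
    """Return N-Triples lines linking a synset to its word senses and gloss."""
    s = f"<{WN}synset-{synset_id}>"
    lines = [f"{s} <{RDF_TYPE}> <{WN}Synset> ."]
    for w in words:
        lines.append(f"{s} <{WN}containsWordSense> <{WN}wordsense-{w}> .")
    lines.append(f'{s} <{WN}gloss> "{gloss}" .')
    return lines

triples = synset_triples("dog-noun-1", ["dog", "domestic_dog"],
                         "a domesticated canid")
for t in triples:
    print(t)
```

A real conversion would also carry hypernym/hyponym links between synsets, which is what makes the RDF form useful for ontology mapping.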

Botnets are getting larger and smarter

February 19th, 2006

Brian Krebs’ Security Fix blog has additional material that didn’t make it into his Washington Post Magazine article on botmasters and malware. I found his discussion of botnet research to be very intriguing. Botnets are using protocols and algorithms developed over the past decade in the distributed systems and software agents research communities. Some botnets, for example, can contain hundreds of thousands of PCs, making central control infeasible:

“But controlling the activities of tens of thousands of hacked PCs can take an enormous amount of computer processing power and Internet-access bandwidth. As such, botmasters have adapted their command-and-control networks to accommodate much larger botnets.

One popular way to control large numbers of compromised machines is through delegation. For example, if a botmaster has compromised 100,000 PCs, but only has the capacity or bandwidth to control 10 percent of those computers, the attacker can organize the victim PCs into hundreds of much smaller groups, with a “lieutenant” bot in each group that orchestrates connections and communications between other members of the platoon and the bot herder’s main channel.

In such a scenario, the individual bots are democratic. Should a lieutenant suddenly be unplugged from the Web or discovered and cleaned up by a security professional, the remaining bots in the platoon are programmed to hold a virtual “election” to see which computers should replace it. In most cases, the PC with the fastest and/or most reliable Internet connection becomes the new lieutenant.

There is one factor in controlling vast numbers of bots that can mask the true size of any given botnet, Dagon said. To reduce the load that a massive botnet would place on a command-and-control network, many bots are configured to remain mostly disconnected from the herd, “phoning home” periodically to check for updates or new instructions.”
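The failover election described in the quote is a classic leader-election pattern, and is easy to sketch. The data model and the scoring rule below are my invention for illustration; the article says only that the bot with the fastest and/or most reliable connection wins:

```python
# Sketch of the "election" scheme described above: when a platoon's
# lieutenant disappears, the remaining bots pick the peer with the
# best connection as its replacement. The Bot fields and the scoring
# formula are invented for illustration.
from dataclasses import dataclass

@dataclass
class Bot:
    host: str
    bandwidth_kbps: int   # proxy for connection speed
    uptime_ratio: float   # proxy for reliability (0.0-1.0)

def elect_lieutenant(platoon):
    """Pick the bot with the best combination of speed and reliability."""
    return max(platoon, key=lambda b: b.bandwidth_kbps * b.uptime_ratio)

platoon = [
    Bot("10.0.0.2", 256, 0.90),    # score 230.4
    Bot("10.0.0.3", 1024, 0.60),   # score 614.4  <- winner
    Bot("10.0.0.4", 512, 0.99),    # score 506.9
]
print(elect_lieutenant(platoon).host)  # → 10.0.0.3
```

The same structure (partition into groups, elect a coordinator, re-elect on failure) appears throughout the distributed-systems literature, which is exactly the point of the post.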

This makes me wonder whether sploggers are using botnets. Setting up a splog on an infected machine would be more complicated for a bot, since it would require a minimal HTTP server, and also riskier, since it entails both outgoing and incoming connections. We have a very large collection of sites identified as hosting one or more splogs; an analysis of the distribution of their IP addresses might be interesting.
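As a hypothetical first cut at that analysis, one could bucket the splog hosts by /16 prefix and look for clustering, which might hint at a few bulk hosts versus a botnet-like scatter. The addresses below are documentation examples, not real splog data:

```python
# Hypothetical sketch: bucket splog-hosting IPv4 addresses by /16
# prefix to see whether they cluster in a few netblocks. The sample
# addresses are reserved documentation ranges, not real splog hosts.
from collections import Counter

splog_ips = [
    "192.0.2.10", "192.0.2.77", "192.0.2.201",
    "198.51.100.5", "198.51.100.9",
    "203.0.113.42",
]

def prefix16(ip):
    """Return the /16 netblock of a dotted-quad IPv4 address."""
    a, b, _, _ = ip.split(".")
    return f"{a}.{b}.0.0/16"

counts = Counter(prefix16(ip) for ip in splog_ips)
for block, n in counts.most_common():
    print(block, n)   # most-populated netblocks first
```

A heavy skew toward a handful of /16s would point at bulk hosting; a long flat tail across consumer ISP ranges would look more like compromised home machines.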

The people behind botnets and malware

February 18th, 2006

The Washington Post has a very long feature article on botnets and malware that is quite interesting. It focuses more on the people involved (perpetrators, victims, and prosecutors) than on the technology. The centerpiece of the article is the story of a botnet master, 0x80, who claims to make $80K a year working out of his bedroom in his parents’ house.

Invasion of the Computer Snatchers
By Brian Krebs, Sunday, February 19, 2006; Page W10

Hackers are hijacking thousands of PCs to spy on users, shake down online businesses, steal identities and send millions of pieces of spam. If you think your computer is safe, think again.

Washington Post questions value of algebra

February 17th, 2006

The US may be in worse shape than previously thought. The Washington Post published a piece from Op-Ed columnist Richard Cohen yesterday questioning whether algebra should be a required subject in high school.

“You will never need to know algebra. I have never once used it and never once even rued that I could not use it. You will never need to know–never mind want to know–how many boys it will take to mow a lawn if one of them quits halfway and two more show up later–or something like that. Most of math can now be done by a computer or calculator. On the other hand, no computer can write a column or even a thank-you note–or reason even a little bit. If, say, the school asked you for another year of English or, God forbid, history, so that you actually had to know something about your world, I would be on its side. But algebra? Please.”

A post on Scientific American’s blog summed it up nicely:

No one in Hyderabad or Shenzhen is calling for getting rid of secondary algebra requirements. Why? The math is simple:

No algebra = No calculus = No science = No technology = We’re totally *&$#FRTDG!!!!!

Ironically, the ads that Google places on Cohen’s piece on the web are for companies offering books and tutoring to help students learn algebra. One of the country’s most respected papers warns kids away from a traditional academic subject as outmoded, while crass advertisers out to get rich push their wares on the poor kids.