UMBC ebiquity

Archive for August, 2008

Are Russian users participating in cyberattacks on Georgia?

August 13th, 2008, by Tim Finin, posted in GENERAL, Social media

Updated below.

In a post about the recent cyberattacks on Georgian computers from Russian sites, the Shadowserver site asks, “Is it possible the same thing that happened to Estonia is happening to Georgia? To put it quite simply, the answer is yes.” They offer the following as evidence.

“Lots of ICMP traffic and Russian hosts sounds a lot more like users firing off the ‘ping’ command and a lot less like some evil government controlled botnet. It did not take us long to find out what is going on. Much like in the attacks against Estonia, several Russian blogs, forums, and websites are spreading a Microsoft Windows batch script that is designed to attack Georgian websites. Basically people are taking matters into their own hands and asking others to join in by continually sending ICMP traffic via the ‘ping’ command to several Georgian websites, of which the vast majority are government.

The following text is a redacted version of the script being posted: [the redacted script, shown as an image in the original post, is omitted here]

We have removed the actual commands and parameters of the script to avoid being a distribution point for it. However, you can see the raw list of targets that are being spread across the websites. This script has been posted on several websites and is even being hosted as “war.rar” which contains “war.bat” within it on one site. It would appear that these cyber attacks have certainly moved into the hands of the average computer-using citizen.”

Their conclusion is that ordinary users are now participating in the continuing attacks on Georgian websites.
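Shadowserver describes the attack traffic as ordinary users running the stock ping command. For the defender’s side of the picture, here is a minimal sketch of how such a crowd-sourced ping flood would show up in a traffic capture: many distinct sources each sending a steady stream of ICMP echo requests. This uses the scapy packet library, and the file name and threshold are illustrative assumptions, not anything Shadowserver published.

```python
# Sketch: spotting ping-flood participants in a packet capture with scapy.
# "capture.pcap" and the 1,000-request threshold are illustrative assumptions.
from collections import Counter

from scapy.all import ICMP, IP, rdpcap

packets = rdpcap("capture.pcap")

# Tally ICMP echo requests (type 8, what `ping` sends) by source address.
echo_sources = Counter(
    pkt[IP].src
    for pkt in packets
    if pkt.haslayer(ICMP) and pkt[ICMP].type == 8
)

# A handful of pings is normal; thousands from one host suggests a flood.
for src, count in echo_sources.most_common(20):
    if count > 1000:
        print(f"{src}: {count} echo requests -- possible flood participant")
```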

Update I (8/13): Ars Technica has a post that quotes experts who question the idea that the Russian government was ever involved in the DDoS attacks.

“According to Gadi Evron, former Chief information security officer (CISO) for the Israeli government’s ISP, there’s compelling historical evidence to suggest that the Russian military is not involved. He confirms that Georgian websites are under botnet attack, and that yes, these attacks are affecting that country’s infrastructure, but then notes that every politically tense moment over the past ten years has been followed by a spate of online attacks. It was only after Estonia made its well-publicized (and ultimately inaccurate) accusations against Russia that such attacks began to be referred to as cyberwarfare instead of politically motivated hackers.”

Update II (8/14): A Google Blog Search query returns two results for a distinctive comment in the script posted by Shadowserver. A search against Google’s main index turns up a few more that look like they are intended to share the script with people who will use it. And, finally, a search over Google Groups returns no results. It looks like there are only about ten instances on open sites indexed by Google, and I was not able to find anything using Technorati. It may be that the script is circulating on sites that Google does not index. If it was widely distributed, this may have happened through mailing lists that Google cannot index, either because they are marked as private or because they are run by another company, such as Yahoo.

Cyberwar between Russia and Georgia preceded shooting

August 12th, 2008, by Tim Finin, posted in GENERAL

In an article in Wednesday’s New York Times, Before the Gunfire, Cyberattacks, John Markoff describes how the Russia-Georgia conflict broke out on the Internet weeks before the troops engaged.

“Weeks before bombs started falling on Georgia, a security researcher in suburban Massachusetts was watching an attack against the country in cyberspace. Jose Nazario of Arbor Networks in Lexington noticed a stream of data directed at Georgian government sites containing the message: “win+love+in+Rusia.”

Other Internet experts in the United States said the attacks against Georgia’s Internet infrastructure began as early as July 20, with coordinated barrages of millions of requests — known as distributed denial of service, or D.D.O.S., attacks — that overloaded and effectively shut down Georgian servers.

Researchers at Shadowserver, a volunteer group that tracks malicious network activity, reported that the Web site of the Georgian president, Mikheil Saakashvili, had been rendered inoperable for 24 hours by multiple D.D.O.S. attacks. They said the command and control server that directed the attack was based in the United States and had come online several weeks before it began the assault.

As it turns out, the July attack may have been a dress rehearsal for an all-out cyberwar once the shooting started between Georgia and Russia. According to Internet technical experts, it was the first time a known cyberattack had coincided with a shooting war.

But it will likely not be the last, said Bill Woodcock, the research director of the Packet Clearing House, a nonprofit organization that tracks Internet traffic. He said cyberattacks are so inexpensive and easy to mount, with few fingerprints, they will almost certainly remain a feature of modern warfare. “It costs about 4 cents per machine,” Mr. Woodcock said. “You could fund an entire cyberwarfare campaign for the cost of replacing a tank tread, so you would be foolish not to.”

There’s lots more of interest to read in the article.

Universities resist RIAA information requests

August 12th, 2008, by Tim Finin, posted in GENERAL

The Chronicle of Higher Education has an article, Antipiracy Campaign Exasperates Colleges, on how universities are beginning to resist the RIAA’s growing demands for their help in fighting music file sharing.

“Talk to the chief information officer at just about any American university, and he will probably say that his institution has bent over backward to help the Recording Industry Association of America curb illegal file sharing on his campus. He will also tell you he’s angry.

On e-mail lists and in interviews, university CIOs and other information-technology professionals say their mission is getting derailed and staff time is being overloaded by copyright takedown notices, “prelitigation settlement letters,” RIAA-issued subpoenas, lobbying efforts, and panicked students accused of piracy.

Now, feeling burdened and betrayed, some of those universities are quietly fighting back, resisting requests for information and trying to quash subpoenas. Those that do so, though, find that their past compliance — and the continued compliance of their peer institutions — is being held against them.

“We feel like we’ve been led down the garden path, and our interest in working in partnership and leading our mission as educators is now being used against us,” said Tracy Mitrano, director of IT policy at Cornell University.

For years the entertainment industry and higher education have considered themselves allies in the fight to curb illegal file sharing on campuses, most visibly through the Joint Committee of the Higher Education and Entertainment Communities Technology Task Force. Over the past year, joint-committee members from universities say tensions have grown, primarily because they feel betrayed by the industry’s lobbying to force filtering technology on university networks.”

Geographic distribution of social networking systems popularity

August 12th, 2008, by Tim Finin, posted in GENERAL, Social media

Using Google’s Insights for Search, Pingdom has “looked at 12 of the top social networks to answer a simple, but highly interesting question: Where are they the most popular?”. In their post, Social network popularity around the world, they surveyed MySpace, Facebook, Hi5, Friendster, LinkedIn, Orkut, Last.fm, LiveJournal, Xanga, Bebo, Imeem and Twitter. Their technique was simple: search for each network’s name (e.g., MySpace) and use the “regional interest” estimates that Insights for Search reports. Here are some observations they made:

  • Facebook is most popular in Turkey and Canada.
  • Friendster and Imeem are most popular in the Philippines.
  • LinkedIn is most popular in India.
  • Twitter is most popular in Japan.
  • LiveJournal is more popular in Russia than it is in the United States.
  • Orkut is more popular in Iran (10th country popularity-wise) than it is in the United States.
  • MySpace is the only social network which is most popular in the United States.
  • MySpace, LinkedIn, LiveJournal, Xanga, and Twitter are the only social networks in this survey which have the United States in their top five countries, popularity-wise. That is just five out of twelve.

The technique is simple and somewhat crude, but probably accurate enough for a first-order approximation. It also provides data that complements what these systems themselves report about the geographic distribution of their users.
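For readers who want to replicate the idea programmatically, here is a rough sketch using pytrends, a later, unofficial Python client for Google Trends (which absorbed Insights for Search). The keyword and timeframe are illustrative; this is not how Pingdom ran their survey.

```python
# Sketch: regional-interest lookup via pytrends, an unofficial Google
# Trends client. The keyword and timeframe are illustrative assumptions.
from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-US", tz=360)
pytrends.build_payload(["MySpace"], timeframe="2008-01-01 2008-08-01")

# One row per country, scored 0-100 relative to the peak region.
by_country = pytrends.interest_by_region(resolution="COUNTRY")
print(by_country.sort_values("MySpace", ascending=False).head(10))
```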

Authentication via passwords or certificates?

August 10th, 2008, by Tim Finin, posted in GENERAL, Security

In a NYT article, Goodbye, Passwords. You Aren’t a Good Defense, author Randall Stross lays out the case against password-based authentication for the Web and argues for approaches that use public key certificates, like Information Cards.

We are all familiar with the problems of passwords — it’s too hard to keep track of multiple ‘strong’ passwords, so we use and reuse one or maybe a few simple ones. These can be all too easily compromised by password cracking, phishing or packet sniffing.

“The solution urged by the experts is to abandon passwords — and to move to a fundamentally different model, one in which humans play little or no part in logging on. Instead, machines have a cryptographically encoded conversation to establish both parties’ authenticity, using digital keys that we, as users, have no need to see. In short, we need a log-on system that relies on cryptography, not mnemonics.

As users, we would replace passwords with so-called information cards, icons on our screen that we select with a click to log on to a Web site. The click starts a handshake between machines that relies on hard-to-crack cryptographic code. The necessary software for creating information cards is on only about 20 percent of PCs, though that’s up from 10 percent a year ago. Windows Vista machines are equipped by default, but Windows XP, Mac and Linux machines require downloads.”
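To make the “cryptographically encoded conversation” concrete, here is a toy challenge-response exchange in Python using the cryptography package. It is emphatically not the actual Information Card protocol (which is built on web-service security standards); it only illustrates the core idea that what gets proved is possession of a private key, not knowledge of a memorized secret.

```python
# Toy key-based log-on: the site sends a random challenge, the client
# signs it with a private key the user never sees, the site verifies.
# This illustrates the idea only -- it is NOT the Information Card protocol.
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# Key pair held by the identity selector on the user's machine; the
# public half is registered with the web site ahead of time.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

# Site: issue a fresh random nonce so a recorded exchange can't be replayed.
challenge = os.urandom(32)

# Client: prove possession of the private key by signing the nonce.
pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                  salt_length=padding.PSS.MAX_LENGTH)
signature = private_key.sign(challenge, pss, hashes.SHA256())

# Site: check the proof; raises InvalidSignature if it fails.
public_key.verify(signature, challenge, pss, hashes.SHA256())
print("logged on -- no password ever crossed the wire")
```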

Stross argues that OpenID is not a solution, but just more of the problem:

“We won’t make much progress on information cards in the near future, however, because of wasted energy and attention devoted to a large distraction, the OpenID initiative. OpenID promotes “Single Sign-On”: with it, logging on to one OpenID Web site with one password will grant entrance during that session to all Web sites that accept OpenID credentials.”

I’ve not tried using Information Cards yet, but plan to. You start by downloading an identity selector client onto your computer. Microsoft offers CardSpace for Windows, and DigitalMe seems to be a popular choice for various Unix systems, including Mac OS X.

When you see this post, it will be 08:08:08 08/08/08

August 8th, 2008, by Tim Finin, posted in Humor

This post is scheduled to be released at an interesting moment in time: 08:08:08 on 08/08/08. The bad news is you’ve probably seen this sort of nonsense before — like eleven months ago. The good news is that we’ll only have to put up with it for another four years, after which we can take a break until early in the morning on January 1, 2101.
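If you want to check the arithmetic, a few lines of Python enumerate every such moment: the shared value has to be a valid month, so only years ending in 01 through 12 produce one.

```python
from datetime import datetime

# Every moment where hh:mm:ss matches mm/dd/yy. The shared value must
# be a valid month, so only years ending in 01-12 produce one.
moments = [
    datetime(year, n, n, n, n, n)
    for year in range(2001, 2200)
    if 1 <= (n := year % 100) <= 12
]
print(moments[7])   # 2008-08-08 08:08:08 -- this post
print(moments[12])  # 2101-01-01 01:01:01 -- the first one after 12/12/12
```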

On Larrabee and how multi-core computers will change CS education

August 7th, 2008, by Anupam Joshi, posted in CS, GENERAL, High performance computing, MC2, Multicore Computation Center, Programming

My colleague Marc Olano recently blogged about the new Larrabee chip from Intel, which will be described in a SIGGRAPH paper in a session he is chairing. This chip, with multiple old Pentium-type cores running at 1GHz, seems a logical culmination of the recent multi/many-core trend. IBM’s plans with the Cell/BE, and perhaps with the newer generation Power chips, are also headed in a similar direction. Short of material scientists doing some magic with high-K dielectrics or airgaps or CNFETs or whatever, the trend is away from a single CPU with more transistors running faster and faster, and toward multicore chips that are not clocked very fast. There’s a good reason for it (heat), as anyone who has owned a high-end laptop and actually put it on their lap can testify. Further down the road, even more complex parallel architectures are proposed, with MCMs on chip connecting optically, and perhaps even memory stacked on top of the CPU layer talking optically back and forth! In other words, a few years down the road, the default box on which a system builder will write code will be something other than a single-core CPU. Bernie Meyerson from IBM discusses such issues in his talks — I can’t lay my hands on a publicly available PowerPoint, but some of the ideas are discussed in a recent interview.

Do these developments mean that we should be rethinking Programming 1 and 2, especially for CS majors? Do students now need to think about parallel or multi-threaded programming from day one? Can that be done without first doing standard imperative programming? Given the less-than-ideal state of high school CS education, is it realistic to expect that students will get Programming 1 (and maybe 2) in high school? In our department, we’re offering a class on programming the Cell/BE and a course related to GPU programming, but those are typically meant for seniors. What about courses further upstream? Should data structures and algorithms change — maybe concepts like transactional memory need to be introduced? Should OS change — talk much more about virtualization, and redo virtual memory for when ample NVRAM is available and accessible from a core?
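As a tiny example of what “thinking parallel from day one” means, here is the classic shared-counter exercise, sketched in Python: without the lock, four threads doing read-modify-write on one variable can silently lose updates, which is exactly the kind of bug a purely sequential Programming 1 never surfaces.

```python
import threading

counter = 0
lock = threading.Lock()

def increment(times):
    global counter
    for _ in range(times):
        with lock:        # without the lock, this read-modify-write
            counter += 1  # can interleave across threads and lose updates

threads = [threading.Thread(target=increment, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 400000 with the lock; often mysteriously less without it
```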

MULTIPLY JOBS BY COBOL GIVING IT-JOB-SECURITY.

August 5th, 2008, by Tim Finin, posted in Programming, Semantic Web, Social media

Forget becoming a Pythonista or learning how to exploit the Semantic Web in social networking applications. Learn Cobol.

From today’s New York Times (In California, Retro-Tech Complicates Budget Woes), comes this note on how to avoid being an obsolete geek: be a Cobol hacker. The context is an emergency plan that California Governor Arnold Schwarzenegger tried to deploy to address a $15 billion budget deficit.

“Last week, with no budget agreement in sight, the governor issued an executive order terminating thousands of part-time and temporary state employees and slashing the wages of about 170,000 of the state’s full-time workers to the federal minimum wage.

But the California controller, John Chiang, says the state’s payroll system — which uses a programming throwback known as Cobol, or Common Business-Oriented Language — is so antiquated it would take months to make the changes to workers’ checks.

“In 2003, my office tried to see if we could reconfigure our system to do such a task,” Mr. Chiang told a State Senate committee on Monday. “And after 12 months, we stopped without a feasible solution.”

David J. Farber, a computer science professor at Carnegie Mellon University, said using Cobol was roughly equivalent to having “a television with vacuum tubes.”

“There are no Cobol programmers around anymore,” Mr. Farber said. “They retired centuries ago.”

Try this Google search for Cobol programming assignments and see how many results are returned.

Dell trying to trademark cloud computing

August 3rd, 2008, by Tim Finin, posted in cloud computing, Multicore Computation Center, Semantic Web, Social media

Cloud computing is a hot topic this year, with IBM, Microsoft, Google, Yahoo, Intel, HP and Amazon all offering, using or developing high-end computing services typically described as “cloud computing”. We’ve started using it in our lab, like many research groups, via the Hadoop software framework and Amazon’s Elastic Compute Cloud services.
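For readers who haven’t seen it, the canonical first exercise on Hadoop is a streaming word count; a minimal sketch is below. The two scripts are illustrative, and the streaming-jar path in the invocation varies by installation, so treat it as an assumption.

```python
#!/usr/bin/env python
# mapper.py -- emit "word<TAB>1" for every word on standard input.
import sys

for line in sys.stdin:
    for word in line.split():
        print(f"{word}\t1")
```

```python
#!/usr/bin/env python
# reducer.py -- sum the counts for each word; Hadoop sorts by key first.
import sys

current_word, total = None, 0
for line in sys.stdin:
    word, count = line.rsplit("\t", 1)
    if word != current_word:
        if current_word is not None:
            print(f"{current_word}\t{total}")
        current_word, total = word, 0
    total += int(count)
if current_word is not None:
    print(f"{current_word}\t{total}")
```

These run under Hadoop Streaming with something like: hadoop jar hadoop-streaming.jar -input in -output out -mapper mapper.py -reducer reducer.py -file mapper.py -file reducer.py (the jar’s location depends on your installation).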

Bill Poser notes in a post (Trademark Insanity) on Language Log that Dell has applied for a trademark on the term “cloud computing”.

It’s bad enough that we have to deal with struggles over the use of trademarks that have become generic terms, like “Xerox” and “Coke”, and trademarks that were already generic terms among specialists, such as “Windows”, but a new low in trademarking has been reached by the joint efforts of Dell and the US Patent and Trademark Office. Cyndy Aleo-Carreira reports that Dell has applied for a trademark on the term “cloud computing”. The opposition period has already passed and a notice of allowance has been issued. That means that it is very likely that the application will soon receive final approval.

It’s clear, at least to me, that “cloud computing” has become a generic term in general use for “data centers and mega-scale computing environments” that make it easy to dynamically focus a large number of computers on a computing task. It would be a shame to have one company claim it as a trademark. On Wikipedia, a redirect for the Cloud Computing page was created several weeks before Dell’s USPTO application. A Google search produces many uses of “cloud computing” in news articles before 2007, although it’s clear that its use didn’t take off until mid-2007.

An examination of a Google Trends map shows that searches for “cloud computing” began in September 2007 and have increased steadily, eclipsing searches for related terms like Hadoop, “map reduce” and EC2 over the past ten months.
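The same comparison can be scripted. Here is a rough sketch using the unofficial pytrends client mentioned earlier in this archive; the timeframe is an illustrative assumption.

```python
# Sketch: compare relative search interest over time with pytrends,
# an unofficial Google Trends client. Timeframe is an assumption.
from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-US", tz=360)
pytrends.build_payload(["cloud computing", "Hadoop", "EC2"],
                       timeframe="2006-01-01 2008-08-13")
trend = pytrends.interest_over_time()  # weekly 0-100 relative volume
print(trend[["cloud computing", "Hadoop", "EC2"]].tail())
```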

Here’s a document giving the current status of Dell’s trademark application (USPTO #77139082), which was submitted on March 23, 2007. According to the Wikipedia article on cloud computing, Dell

“… must file a ‘Statement of Use’ or ‘Extension Request’ within 6 months (by January 8, 2009) in order to proceed to registration, and thereafter must enforce the trademark to prevent removal for ‘non-use’. This may be used to prevent other vendors (eg Google, HP, IBM, Intel, Yahoo) from offering certain products and services relating to data centers and mega-scale computing environments under the cloud computing moniker.”
