Senate plan: less stimulus for NSF, NIST, other science agencies

February 9th, 2009

The US Senate’s stimulus plan released at the end of last week has less money for US science agencies than the House plan from January, but the cuts were not as drastic as had been feared. CRA reports in its post “Senate Deal Protects Much of NSF Increase in Stimulus” that

“The agreement does reduce the increase in the Department of Energy’s Office of Science by $100 million (so, +$330 million instead of +$430 million), and NIST’s increase would be reduced by $100 million (so +$495 million instead of +$595 million). But given the reports we were receiving as recently as yesterday evening about the possibility of no increase for the science agencies in the bill, this is a remarkable turn of events. The increase for NSF in the Senate bill will still be far less than the $3 billion called for in the House version of the bill, but NSF will be in far better shape in the conference between the two chambers coming in with $1.2 billion from the Senate instead of zero.”

Scientists and Engineers for America (a 501(c)(3) organization) has a detailed breakdown of the stimulus package that passed the Senate on Friday in “Senate-passed stimulus package by the numbers.” They also have a downloadable Excel spreadsheet in case you want to crunch the data yourself. Here are some science highlights from their post:

NSF Research: $1.2 billion total for NSF including: $1 billion to help America compete globally; $150 million for scientific infrastructure; and $50 million for competitive grants to improve the quality of science, technology, engineering, and mathematics (STEM) education.

NASA: $1.3 billion total for NASA including: $450 million for Earth science missions to provide critical data about the Earth’s resources and climate; $200 million to enable research and testing of environmentally responsible aircraft and for verification and validation methods for complex aerospace systems and software; $450 million to reduce the gap in time that the U.S. does not have a vehicle to access the International Space Station; and $200 million for repair, upgrade and construction at NASA facilities.

NOAA: $1 billion total for NOAA, including $645 million to construct and repair NOAA facilities, equipment and vessels to reduce the Nation’s coastal charting backlog, upgrade supercomputer infrastructure for climate research, and restore critical habitat around the Nation.

NIST: $475 million total for NIST including: $307 million for renovation of NIST facilities and new laboratories using green technologies; $168 million for scientific and technical research at NIST to strengthen the agency’s IT infrastructure; provide additional NIST research fellowships; provide substantial funding for advanced research and measurement equipment and supplies; increase external grants for NIST-related research.

DOE: The Department of Energy’s Science program sees $330 million for laboratory infrastructure and construction.


JWS special issue on Semantic Web and Policy (free sample issue)

January 13th, 2009

Elsevier has made the January 2009 Journal of Web Semantics special issue on the Semantic Web and Policy our new sample issue, which means that its papers are freely available online until a new sample issue is selected. The special issue editors, Lalana Kagal, Tim Berners-Lee and James Hendler, wrote in the introduction:

“As Semantic Web technologies mature and become more accepted by researchers and developers alike, the widespread growth of the Semantic Web seems inevitable. However, this growth is currently hampered by the lack of well-defined security protocols and specifications. Though the Web does include fairly robust security mechanisms, they do not translate appropriately to the Semantic Web as they do not support autonomous machine access to data and resources and usually require some kind of human input. Also, the ease of retrieval and aggregation of distributed information made possible by the Semantic Web raises privacy questions as it is not always possible to prevent misuse of sensitive information. In order to realize its full potential as a powerful distributed model for publishing, utilizing, and extending information, it is important to develop security and privacy mechanisms for the Semantic Web. Policy frameworks built around machine-understandable policy languages, with their promise of flexibility, expressivity and automatable enforcement, appear to be the obvious choice.

It is clear that these two technologies – Semantic Web and Policy – complement each other and together will give rise to security infrastructures that provide more flexible management, are able to accommodate heterogeneous information, have improved communication, and are able to dynamically adapt to variations in the environment. These infrastructures could be used for a wide spectrum of applications ranging from network management, quality of information, to security, privacy and trust. This special issue of the Journal of Web Semantics is focused on the impact of Semantic Web technologies on policy management, and the specification, analysis and application of these Semantic Web-based policy frameworks.”

In addition to the editors’ introduction, the special issue includes five papers.


Guess who is coming to grad school!

October 2nd, 2008

UMBC alumnus Alark Joshi (PhD 2007) pointed out this great comic yesterday on Jorge Cham’s PHD Comics site. It shows one upside to the current financial crisis. That might sound self-serving, since I am part of the higher-education industry that stands to profit, but I think our society benefits as a whole if more people pursue an advanced degree, especially if the alternative is to become yet another hedge fund manager.




Five Cloud Computers and Information Sharing

July 28th, 2008

There was an interesting panel opening the Microsoft Faculty Research Summit featuring Rick Rashid, Daniel Reed, Ed Felten, Howard Schmidt, and Elizabeth Lawley. Lots of interesting ideas were discussed, but one that stood out was the recent suggestion that maybe the world only needs five (cloud) computers. If something like this really does happen, then perhaps we’ll need to think even more aggressively about information sharing issues. Is there some way for me to make sure that I share with (say) Google’s cloud only the things that are absolutely needed? Once I have given some information to Google, can I still retain some control over it? Who owns that information now? If I do, how do I know that Google will honor whatever commitments it makes about how it will use or further share that information? We’ll be exploring some of these questions in our “Assured Information Sharing” research. Some of the auditing work that MIT’s DIG group has done also ties in.
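To make the “share only what is absolutely needed” question a bit more concrete, here is a minimal, hypothetical sketch in Python of a purpose-based release check. The provider name, the policy format, and the data fields are all invented for illustration; this is not code from the Assured Information Sharing project.

    # Hypothetical sketch: decide which fields of a personal record may be
    # released to a cloud provider, based on the purposes that provider has
    # declared.  All names and the policy format are made up for illustration.

    # Purposes each provider claims it needs data for.
    PROVIDER_PURPOSES = {
        "google-cloud": {"calendar-sync", "search-personalization"},
    }

    # Which of my fields may be released, and for which purposes.
    RELEASE_POLICY = {
        "email-address": {"calendar-sync"},
        "location": set(),                      # never released
        "search-history": {"search-personalization"},
    }

    def fields_to_release(provider, record):
        """Return only the fields the policy allows this provider to receive."""
        purposes = PROVIDER_PURPOSES.get(provider, set())
        return {
            field: value
            for field, value in record.items()
            if RELEASE_POLICY.get(field, set()) & purposes
        }

    if __name__ == "__main__":
        me = {
            "email-address": "user@example.org",
            "location": "39.25N, 76.71W",
            "search-history": ["semantic web", "policy languages"],
        }
        print(fields_to_release("google-cloud", me))
        # only email-address and search-history are released; location is held back

The point of the sketch is simply that such a policy can be checked mechanically before any data leaves my machine; the harder questions above, about what happens to the data afterwards and how commitments are honored, are exactly where policy and accountability research comes in.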


Our MURI grant gets some press

June 12th, 2008

A UMBC-led team recently won a MURI award from DoD to work on the “Assured Information Sharing Lifecycle”. It is an interesting mix of work on new security models, policy-driven security systems, context awareness, privacy-preserving data mining, and social networking. The award really brings together many different strands of research in eBiquity, as well as some related research in our department. We’re just starting off, and excited about it. UMBC’s web page had a story about this, and more recently, GCN covered it.

The UMBC team is led by Tim Finin and includes several of us. The other participants are UIUC (led by Jiawei Han), Purdue (led by Elisa Bertino), UTSA (led by Ravi Sandhu), UT Dallas (led by Bhavani Thuraisingham), and Michigan (led by Lada Adamic).


Borjas at UMBC

October 11th, 2007

The well-known labor economist George Borjas visited UMBC last week to give a lecture in our humanities series. Borjas is very well known in political circles for his economic analysis of immigration. More importantly, he not only writes scholarly papers, he also blogs in a way that folks like me, who haven’t even taken ECON 101, can understand. I haven’t read any of his papers to see what they look like, but in his blog he is fairly clear about his opinions on various issues related to immigration. See, for instance, this interesting post about “protectionism” on Broadway! I don’t always agree with what he has to say, but it is always a pleasure to read well-written posts that say something reasonable, backed with data and analytic rigor.

So I went to the lecture with great anticipation. I arrived a few minutes late, and the room was already full. The presentation itself was good, but a bit of a letdown, perhaps because he didn’t want to be too controversial in a “distinguished lecture” type setting. He presented data: the increase in immigration since 1964, the concentration of that immigration in select areas that makes its effects local, the confounding factors when you try to analyze the wage effects of immigration, the fact that the wage-depressing effects of immigration have most hurt the lower strata of society, the fact that the average immigrant today earns less than the native born (a change from the 60s), and so on. However, he didn’t go much further than saying something which is both true and a cop-out: namely, that what policy implications you derive from this data depend on what your objective function is. He joked about letting everyone in if the goal was to alleviate world poverty or some such.

I also noticed that he did not split his data into the effects of legal and illegal immigration. It would be interesting to know whether there are differences. Among legal immigrants, does employment-based versus family-based immigration make a difference? That question seems especially relevant since a points-based system was one of the things the now-dead “comprehensive immigration reform” bill proposed.


StopBadware campaign

January 27th, 2006

A good read at http://stopbadware.org; it appears to be a major campaign backed by Google, Lenovo and Sun Microsystems.

“Several academic institutions and major tech companies have teamed up to thwart ‘badware’, a phrase they have coined that encompasses spyware and adware. The new website, StopBadware.org, is promoted as a “Neighborhood Watch” campaign and seeks to provide reliable, objective information about downloadable applications in order to help consumers to make better choices about what they download on to their computers. We want to work with both experts and the broader internet community (.orgs and .edus) to define and understand the problem.”


Models of trust for the Web

November 23rd, 2005

The Workshop on Models of Trust for the Web (MTW’06) will be a one-day workshop held on May 22 or 23, 2006 in Edinburgh in conjunction with the 15th International World Wide Web Conference. Tentative deadlines are January 10 for paper submission and February 1 for acceptance notification.

“There are three types of lies – lies, damn lies, and facts found on the Web.” — anon

“As it gets easier to add information to the web via html pages, wikis, blogs, and other documents, it gets tougher to distinguish accurate information from inaccurate or untrustworthy information. A search engine query usually results in several hits that are outdated and/or from unreliable sources and the user is forced to go through the results and pick what she/he considers the most reliable information based on her/his trust requirements. With the introduction of web services, the problem is further exacerbated as users have to come up with a new set of requirements for trusting web services and web services themselves require a more automated way of trusting each other. Apart from inaccurate or outdated information, we also need to anticipate Semantic Web Spam (SWAM) — where spammers publish false facts and scams to deliberately mislead users. This workshop is interested in all aspects of enabling trust on the web.”


Semantic Web and Policy Workshop wrap up

November 16th, 2005

The Semantic Web and Policy Workshop (SWPW) held at ISWC had some great presentations and discussions on policy-based frameworks for security, privacy, trust, information filtering, accountability, etc. The SWPW web site has the proceedings, papers, presentations and some pictures. Watch for announcements about a related workshop on Models of Trust for the Web that will be held at WWW2006.


CASCON 2005 Keynote – Rob Clyde @ Symantec

October 18th, 2005

Rob Clyde, Vice President of Technology, Office of the CTO at Symantec Corporation, presented his keynote this morning. Along with the usual security material, he reported some interesting statistics:


  • Phishing is an increasing threat: 3 to 4% of users respond to such e-mails, a much higher rate than for traditional e-mail spam.
  • In the first half of 2005, phishing grew from 2.99 million e-mails/day to 5.7 million e-mails/day.
  • 31% of online consumers are buying less due to increased web security threats.
  • The US leads in the number of reported hacked machines, followed closely by Germany.
  • Broadband penetration is actually increasing security threats: many personal machines are now vulnerable to hackers using them as bots for DoS attacks.
  • DoS attacks are now a business, with such attacks available for as little as US $300. Where?

Some other interesting comments:

  • The increasing speed at which worms propagate now demands better use of proactive measures.
  • In the absence of such measures, Akamai and its expandable bandwidth pipes are the only defense against DoS attacks. Looks like more revenue for Akamai in the days to come! Maybe Akamai’s stock is in for a ride.

Finally, and of importance to us, Symantec is now working on combating web (and blog) spam. They see this as one of the next big security threats.


ISWC Semantic Web and Policy Workshop

October 10th, 2005

The Semantic Web and Policy Workshop will be held at the 4th International Semantic Web Conference on 7 November 2005 in Galway, Ireland. The workshop is focused on two research areas:

  • policy-based frameworks for the semantic web for security, privacy, trust, information filtering, accountability, etc.
  • applying semantic web technologies in policy frameworks for application domains such as grid computing, networking, storage systems, pervasive computing, and the specification of norms for agent communities.

In addition to presentations of nine submitted papers, Ora Lassila will give an invited talk on “Applying Semantic Web in Mobile and Ubiquitous Computing: Will Policy-Awareness Help?” and a panel of policy researchers will initiate a discussion of “The 2005 Web Policy Zeitgeist”. The proceedings are available and participants can register online.


RSS and Podcasts at UMBC

October 7th, 2005

The UMBC website now publishes RSS feeds for news and podcasts.

Good move – subscribed!
At least now I will follow what all students at UMBC should have been checking regularly anyway.
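For anyone who would rather pull the feed from a script than a feed reader, here is a minimal sketch using Python’s feedparser library. The feed URL below is a placeholder for illustration, not the actual UMBC feed address.

    # Minimal sketch: fetch and print an RSS feed with feedparser
    # (pip install feedparser).  The URL is a placeholder, not the real feed.
    import feedparser

    FEED_URL = "https://www.umbc.edu/news/rss"  # placeholder address

    feed = feedparser.parse(FEED_URL)
    for entry in feed.entries[:10]:
        # each entry exposes the usual RSS fields: title, link, summary, published, ...
        print(entry.title, "-", entry.link)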