Weblog on the Internet and public policy, journalism, virtual community, and more from David Brake, a Canadian academic, consultant and journalist

Archive for the 'Academia' Category

7 February 2003
Filed under: Academia, Copyright, Net politics at 8:35 pm

[Image: lessig.jpg]
Lawrence Lessig (above) was undoubtedly the star speaker at
The Politics of Code conference yesterday. His presentation was certainly the zippiest one, and I particularly liked the diagram pictured above, which shows simply how laws designed to protect copyright against violation (outer circle), by protecting copyright-protective code (middle circle), can block access to copyrighted material even when existing copyright law (inner red circle) allows fair use access.

It was encouraging to hear that he believes that, due in part to meetings like this one, the public in the UK, US and elsewhere are slowly starting to grapple seriously with the issues, though he told me that there is nobody he could name who has produced what he feels is a good theoretical alternative to the current regulatory messes…

5 February 2003

As iWire pointed out, UCLA’s annual Internet use survey is out, but it draws some odd and hard-to-justify conclusions from its data. “Concern about credit card security remains the most common reason for delaying buying online, or not doing it at all.” Well – the most common stated reason anyway. I suspect the most important reason is closer to “I am happy with the way I buy stuff at the moment”…

What are we to make of the explanation that 28.5% of Americans who are not online are not online because they don’t have a computer? That doesn’t tell us much about why they don’t have one. Ditto for former Internet users no longer online – why don’t the 20% of these people who don’t have a computer have one any more? And what proportion of people have dropped out? It doesn’t say!

Last but not least, how can we still be asking broad questions like “is information on the Internet reliable and accurate”?
That’s like asking “is information in the library reliable and accurate?” Well, sometimes yes and sometimes no!

2 February 2003

Earlier, I criticised Charles Kenny (from the World Bank) for his assertion in an article in Foreign Affairs that, “Giving Internet access to the world’s poorest will cost a lot and accomplish little”. Admittedly, he qualified that statement later in his article. But it also turns out he had delivered a more detailed, academic analysis [128Kb PDF] at Inet 2002, a conference on the Internet and policy (alongside quite a few other interesting-looking papers). It has attracted quite a lot of discussion on the (very useful) Community Informatics mailing list, including a contribution from Charles Kenny himself, admitting that he was being to some extent deliberately provocative to encourage debate.

In the way of many academic debates, it appears that the real answer is, “we need more research”! (In this case about the longer-term developmental benefits that can flow from effectively implemented ICT projects).

Howard Rheingold pointed out that, depending on how you implement your IP network, IP telephony means you can get wireless telephony “thrown in” for free (though there is still the cost of IP-based “telephones” to consider, and I don’t imagine the local telephone networks would be too happy about the potential loss of revenue).

[Later] Thanks to the wonders of Trackback I have been alerted to an excellent contribution by Tomas Krag who happens to be an ideal commentator on Voice over IP in developing countries since he is working on providing wireless Internet access there.

It all reminds me a little of the criticism levelled by Sp!ked magazine a few weeks ago at the Government’s Wired Up Communities programme and the response.

Basically, my feeling on both issues is that we shouldn’t stop trying different ICT implementations just because it is too early to quantify the benefits and because we are still learning how to implement them most effectively.

30 January 2003

This is the area I am studying at the moment so I found this special report interesting.

It is a little lightweight but cites some useful books. It maintains among other things that the Internet may not after all be a big threat to authoritarian regimes and that it may lead to more direct democracy in democratic countries. It also goes over familiar ground on the issue of privacy.

21 December 2002

Gill Sellar was hired as project coordinator for the Albany GateWAy in 1999 – a service designed to act as a community web portal for a rural, dispersed community in SW Australia. Fortunately for us, she was also a PhD student who decided to do her thesis on the subject of her work – particularly whether such a project could be sustainable. I have been hosting a draft of her thesis for a while, and now she has provided the final copy (3Mb – or 1.75 Mb as a Zip file). It’s 315 pages long but well worth a look if you are interested in how virtual community services can be sustainable and help to build social capital in rural areas.

I have asked her if she could provide an “executive summary” and if she does so I will post it here or link to it.

27 November 2002
Filed under: About the Internet, Academia at 11:53 pm

I’m trying to figure out what percentage of the WWW (roughly) is covered by any one search engine, and/or by a reasonable selection of several.

For starters, in a March 2002 comparison of ten search engines, half of the pages found across four sample searches were found by only one search engine and another 20% by only two, which suggests to me that the proportion of the total number of public pages indexed by any one of these search engines is low. What we don’t know, of course, is how many “public” web pages are out there that none of the search engines find.

The OCLC indicates, based on random IP sampling, that the number of web servers has roughly tripled since 1999. I chose 1999 because, according to a peer-reviewed paper in Science that year – the only study giving a figure for the number of web pages that I trust so far – there were around 800m web pages at that time. That gives a guesstimate of 2.4bn pages now.

This estimate has to be low, though, both because of the overlap figure above and because Google says it is indexing 3bn+ pages (and of course the average number of pages per site has probably risen sharply since 1999).
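To make the arithmetic explicit, here is a minimal sketch of the back-of-envelope estimate above, using only the figures quoted in this post (800m pages in 1999, servers roughly tripling since then, and Google’s claimed 3bn+ page index); the variable names are mine and the calculation is purely illustrative:

```python
# Rough, illustrative arithmetic for the "how big is the web / how much do
# search engines cover" guess above. All figures come from the post itself;
# nothing here is measured, it just makes the estimate explicit.

pages_1999 = 800_000_000        # ~800m pages in 1999 (the Science paper)
server_growth = 3               # OCLC: number of web servers roughly tripled since 1999

estimated_pages_now = pages_1999 * server_growth      # ~2.4bn pages
google_index = 3_000_000_000                          # Google's claimed 3bn+ page index

print(f"Guesstimated total pages: {estimated_pages_now:,}")
print(f"Google's index as a share of that guess: {google_index / estimated_pages_now:.0%}")
# Prints 2,400,000,000 and 125% - i.e. Google alone claims more pages than the
# guess, which is why (as noted above) the 2.4bn figure must be an underestimate.
```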

Anyway, is there enough information here – or elsewhere that any of you are aware of – to give me the figure I am looking for?! What sort of additional information would help you to calculate a guess? I thought this would be something someone out there would be keeping track of, but it seems not.

If you have any ideas, please contact me ASAP (or if you prefer post something via comments below).

P.S. There’s lots of good search-engine-related information at searchenginewatch and searchengineshowdown, but nothing that relates to this particular issue since the 1999 piece. I am not that surprised, as it is more of a theoretical than a practical concern for most web surfers and website marketers.

21 October 2002
Filed under: Academia, London, Useful web resources at 1:25 pm

I’ve been doing a little research on second hand books since I became a student.

First off, try searching ABE Books – it covers hundreds of second hand shops around the globe but you can ask it to narrow the search to the UK. Skoob and Unsworths (see below) both use it to index their own collections.

Unsworths
12 Bloomsbury Street
London WC1B 3QA

Tel +44 (0)20 7436 9836

Skoob

Brunswick Centre – the nearest Underground station is Russell Square on the Piccadilly line. The Brunswick Centre is 50 yards from the tube station: turn right out of the station, cross over the road and head for a large concrete edifice. If you know the Renoir Cinema (recommended), Skoob is in the same complex.
(purportedly the largest second hand academic bookshop in London)

Tel +44 (0)20 7278 8760

Judd Two Books, 82 Marchmont St (nr Russell Sq tube), 7387 5333 (no website?)

Waterstones, 82 Gower Street – one of the largest sources of new academic books in London, but it also apparently sells used books.

On a similar note, Amazon UK sells used books alongside new ones when you do a search.
