http://news.bbc.co.uk/2/hi/europe/3144704.stm
Only in Ukraine, you say? Pity! Report on a competition which drew more than 300 participants, the objective being to demonstrate the most outlandish way to destroy your computer. As someone who has had considerable success with this, even without being in a competition, I can see how this might be an attractive event.
The fact that the winners got totally new computer systems says it all about this event.
http://www.oreillynet.com/pub/wlg/3812
The high technology drain from North America parallels the deskilling drain which went on before it -- the consequence, lost jobs. This article suggests some alternative ways in which technology can be used to boost employment, and links to some other structural resources.
My rubric here is always to ask: "What is the easiest thing to do?" because most of the time, that's what in fact gets done. What this author is asking for is a bit more difficult than the trend he opposes, so I am not sanguine about its success. It is, nevertheless, worth thinking about, and I would be delighted to be proved wrong.
http://www.majorgeeks.com/download.php?det=1405
Finding Pin 1 when you are doing construction/repair work can be a real pin in the rear -- I wish I had a nickel for every time I was looking at a machine with my face hanging out and wondering this, because I could have retired by now. This handy guide to the pinplexed can give some rules of thumb for working this out.
http://www.majorgeeks.com/download.php?det=1405
Microsoft applications and the operating system speak not only with a forked tongue, but also with one which is hard to comprehend. This utility will translate the error message codes into something meaningful for human beings.
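For a sense of what such a translation involves, here is a minimal sketch (Windows only, and purely illustrative rather than anything to do with the utility above) that turns a numeric Win32 error code into its system message text via ctypes:

```python
# Minimal sketch: translate Win32 error codes into readable text (Windows only).
import ctypes

def describe_error(code: int) -> str:
    """Return the system's message for a Win32 error code, e.g. 5 -> 'Access is denied.'"""
    return ctypes.FormatError(code)

if __name__ == "__main__":
    # Illustrative codes: 2 = file not found, 5 = access denied, 32 = sharing violation.
    for code in (2, 5, 32):
        print(code, "->", describe_error(code))
```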
The following series of white papers from Spirent Communications covers a variety of real-world testing and configuration issues:
General Network Performance Testing Methodology http://whitepapers.comdex.com/data/detail?id=1063826137_766&type=RES&src=KA_RES
Firewall Testing Methodology Using Spirent Solutions http://whitepapers.comdex.com/data/detail?id=1063826137_766&type=RES&src=KA_RES
Reality Bytes: The Importance of Realism for Improving Web Site Performance http://whitepapers.comdex.com/data/detail?id=1063826137_342&type=RES&src=KA_RES
The Spirent Solution: Enabling the Realism of Live Networks with the Precision of Lab Measurement http://whitepapers.comdex.com/data/detail?id=1063826137_825&type=RES&src=KA_RES
The Network Commandments: Ten Ways to Improve Quality and Performance http://whitepapers.comdex.com/data/detail?id=1063826138_924&type=RES&src=KA_RES
http://whitepapers.comdex.com/data/detail?id=1063197052_173&type=RES&src=KA_RES
When the packet misses a socket, knowing what the last known good configuration actually was may get you out of a pocket of trouble. This white paper: "Enterprise Network Configuration Change Management: A Practitioner's Guide" offers some insight into making this happen.
http://www.certmag.com/articles/templates/cmag_nl_infosec_content.asp?articleid=448&zoneid=39
We all know what prevention is worth, but sometimes we have to apply the pound of cure, and it is extremely helpful to have a set of tips for responding to something as traumatic as a worm attack, which this article summarizes. There are links to a longer version, but this requires signing up for membership, which I normally eschew in listing resources like these.
http://www.cioinsight.com/article2/0,3959,1213563,00.asp
Article indicating that the extent of security problems requires a management response equivalent to business process re-engineering -- in other words, security has to be designed in, not patched on. Links to additional articles provide additional perspectives on this issue.
http://www.ccianet.org/papers/cyberinsecurity.pdf
The title and subtitle of this paper pretty much sum it up: "CyberInsecurity: The Cost of Monopoly; How the Dominance of Microsoft's Products Poses a Risk to Security". Despite its alarmist title, it is written by a bevy of well-known security analysts from all sections of the IT industry, and presents an argument for a diversified software ecology clearly and concisely.
The report makes a point that is worth quoting: "The average user is not, does not want to be, and should not need to be a computer security expert any more than an airplane passenger wants to or should need to be an expert in aerodynamics or piloting." The tendency to "blame the victim" in many of these cases is totally misplaced, and in fact impedes potential solutions.
A short commentary on this paper is available here:
http://mcpmag.com/news/article.asp?editorialsid=613
and a more extended commentary with reflection on the wider issues is available here:
http://news.com.com/2009-7349_3-5140971.html
Another more recent rebuttal is here:
http://www.nature.com/nsu/030922/030922-10.html
Article describing Philips Research's development of an electronic paper which can allow colour movies to display on a single sheet of e-paper. Marry the potential for ubiquity that this represents [can we fold it?] with the capabilities of extended access via wire, and a whole new information ecology could result.
Just by itself this is a development with major implications. Imagine, for example, downloading a book on, say, the campaign in Iraq, which would then update itself as time goes on. Similarly, any intellectual construct could be modified on the fly to take new developments into account.
Or look at it another way -- suppose I download a survey of astronomy, and then after reading it, discover that I am really interested in planetary astronomy -- I could then "expand" the section on planetary astronomy, to discover I am most interested in Mars. In any one case, I would not need or want "the whole book" -- in fact my initial download might take the form of an extended index.
Note this does not displace books -- it simply replaces them in those cases where books are not a particularly good vehicle. In those cases where the physical form itself has intrinsic value, the book still reigns supreme.
According to this article, one of the first PCs saw the light of day in Canada 30 years ago [the weather was so bad it promptly went back into hibernation]. Even if we look at the mainstream, it is hard to believe that over two decades have passed since the PC was introduced.
Again, looking at the capacity of the Intel 8008 microprocessor and comparing it to today's chips makes it clear how impossible it is to extrapolate from the past 30 years into the future 30.
http://sfgate.com/cgi-bin/article.cgi?file=/chronicle/archive/2003/09/23/BUGJ21SE181.DTL&type=tech
A report on Silicon Valley's employment woes suggests that the boom level will not be recovered until 2010, which on the face of it looks like bad news. The silver linings in this cloud suggest that time is available to cure infrastructure problems and make the Valley a more productive place.
Since growth in the Valley is expected to lag national [USA] growth, this suggests at least that there will be some jobs somewhere else.
http://www.knowledgestorm.com/info/user_newsletter/092003_APPLE/article_4.jsp
The reason why we don't teach about Apple products in IT is straightforward: Macs are so simple you don't need to teach people how to use/service them. However true this bit of folk wisdom may be, there are solid reasons to consider a Mac, and it may even be cost effective.
A quick overview of the niceness of the G5 is here:
http://www.knowledgestorm.com/info/user_newsletter/092003_APPLE/article_2.jsp
and the degree to which Apple has made advances in the server market is covered here:
http://www.knowledgestorm.com/info/user_newsletter/092003_APPLE/article_3.jsp
http://www.popsci.com/popsci/science/article/0,12543,473054-1,00.html
A computer scientist has built a robotic head which does a much better job of mimicking human appearance than anything previously done -- indeed, from the photographs in the article, it looks like it does as good a job as anything in the movie _AI_. The conceptual issues involved with this are explored in the article, since this touches a nerve and excites emotional responses.
Again, given the general tendency of mankind, we have to ask ourselves how this will be applied to pornography first, in order to get a handle on how the technology will spread. Think about it long enough in this context, along with other strands of development covered in this blog, and loss of sleep seems like the only reasonable response.
http://edition.cnn.com/2003/TECH/09/19/wow.tech.life.computing/index.html
Research on using DNA as a basis of computing and calculating devices has been ongoing; this report summarizes what has been going on, and what is likely to be coming down the pike. But it ignores one question which should be strobing like the end of a James Cameron movie: at what point is a "machine", based on organic principles, and displaying what appear to be cognitive abilities, no longer merely a machine?
Philosophers have argued cogently that substrate really makes no difference in the attribution of personhood -- but what if there really are no substrate differences?
Suddenly Golem appears a lot less implausible....
http://search.ft.com/search/article.html?id=030924000814
Another look at who will buy 64-bit machines, with an angle which had not occurred to me -- some people will buy them simply because they are "hot" and "the latest thing". An ignorant market is still a market, casting doubt on Intel's claim: "Nobody needs a 64-bit desktop".
As the article indicates, there are still hurdles and obstacles to overcome here, although Apple G5 owners will not have so many [indeed, if anyone out there would like to contribute a top of the line 2-processor G5 to me, along with 23" Cinema display, for me to get some appreciation of this first hand, by all means go ahead!].
It would appear that Intel is in more of a bind here than the other CPU manufacturers -- though they have been in such binds before, and Houdinied their way out of them, so we will have to see.
http://www.fortune.com/fortune/techatwork/articles/1,15704,485825,00.html
The current chaos on the InterNet is generating both concerns and proposals. Here we have a cogent statement of the problem with the advice that business itself has a responsibility and an opportunity to apply better remedial measures than simply blaming Microsoft.
Among the more obvious, painful, yet perhaps unavoidable conclusions is that corporate security spending must increase, perhaps doubling. This may be the only course which can redeem this tragedy of the commons. The alternative is to abandon the InterNet altogether, which does sound like a cure being worse than the disease.
http://www.globetechnology.com/servlet/story/RTGAM.20030918.gttwtico18/BNStory/Technology/
The limits on Moore's Law are approaching like the end of the runway to an overloaded jumbo jet -- 15 years really is not all that far away. Assuming we will still need increases in processing capacity [when we consider that 15 years conservatively represents processors running at a 6 THz pace, it is about as hard to imagine how we will need such processing power as it is to imagine how we will keep the chips cool], then other methods will be needed to continue computing's colossal curve of conquest.
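As a back-of-the-envelope check on that figure, here is a small sketch; the 3 GHz starting point and the doubling periods are my own assumptions, not numbers from the article:

```python
# Rough extrapolation of clock speed under an assumed fixed doubling period.
def extrapolate_clock_ghz(start_ghz: float, years: float, doubling_years: float) -> float:
    return start_ghz * 2 ** (years / doubling_years)

for doubling in (1.25, 1.5, 2.0):
    thz = extrapolate_clock_ghz(3.0, 15, doubling) / 1000
    print(f"3 GHz today, doubling every {doubling} years -> ~{thz:.1f} THz in 15 years")
```

The 6 THz figure sits between the faster two assumptions, which is about all the precision such an extrapolation deserves.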
One option is nanotechnology, which in its "mildest" form -- that is, the capacity simply to create useful objects at the molecular scale -- appears to be a solidly developing technology. This article explores some of the potential behind this development.
http://www.kurzweilai.net/news/frame.html?main=news.html?id%3D2418
Short article from a most stimulating source [Accelerating-Intelligence News from the Kurzweil organization] about a neural network based on a supercomputer. This will be the most advanced and powerful simulation of the human brain yet devised [and the degree to which it falls short of completeness is arresting], and should be a project well worth watching.
Oh yes, remember we can always pull the plug. Can't we?
http://www.usatoday.com/money/industries/technology/maney/2003-09-16-maney_x.htm
Progress in robotics, proceeding in parallel along multiple axes, is bringing us tools which are capable of useful work in complex areas like individual homes. Some of us [those who can afford such products] may have the option to live completely different lives in the future, as robots take over common and time-consuming basic tasks.
The developments in hardware, software, and concepts currently ongoing make this a very promising line of development. After many years of failure to deliver, some of these blue-sky promises will finally pay off [I am still waiting for software which will really allow me to dictate into the computer better than I can type].
Once again, the iceberg metaphor is appropriate -- science fiction has, perhaps, done [and will continue to do] more useful thinking about what all this means than any other discipline. We should be aware that none of these rewards will be a boon unalloyed.
http://www.technewsworld.com/perl/story/31594.html
The bottleneck inherent in the system bus has become increasingly restrictive and apparent as memory speeds and, most particularly, processor speeds have continued to ramp up. Not long ago, a bus:processor ratio of 1:10 was not uncommon, which seems excessive [as a rule of thumb, anything worse than 1:5 represents a genuine block]. Now it is possible to buy cheap PCs where the ratio is over 1:20.
The Apple solution to this is simply to have a fast bus, but the issue is much more complicated in the PC world, as this article explains. With the development of the front-side bus, some of the difficulties involved were relieved, but now that processor speeds approach 3GHz, we once again are in danger of sinking below the 1:5 ratio limit. The fact that most manufacturers agree with the need to address this problem suggests that major breakthroughs in speed are possible, as Apple, again, has demonstrated with the G5.
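A quick way to see the ratio problem in concrete terms; the clock figures below are illustrative, not taken from the article:

```python
# Compute bus:processor ratios for a few illustrative configurations.
def bus_ratio(bus_mhz: float, cpu_mhz: float) -> float:
    return cpu_mhz / bus_mhz

configs = {
    "budget PC (133 MHz bus, 2.8 GHz CPU)": (133, 2800),
    "mainstream PC (400 MHz FSB, 3.0 GHz CPU)": (400, 3000),
    "Apple G5 (1 GHz bus, 2.0 GHz CPU)": (1000, 2000),
}
for name, (bus, cpu) in configs.items():
    ratio = bus_ratio(bus, cpu)
    verdict = "genuine block" if ratio > 5 else "acceptable"
    print(f"{name}: 1:{ratio:.1f} ({verdict})")
```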
http://www.computerworld.com/careertopics/careers/labor/story/0,10801,85055,00.html
The whole issue of IT employment and trends is something which results in a lot of FUD, simply because so many oxen are liable to be gored. This article argues that the employment loss in IT is both exceptional in its severity, and unlikely to be relieved. The degree to which automation and offshore outsourcing are affecting the purely technical aspects of employment makes concentration on management skills as these relate to IT the most rewarding career path to follow.
Given an effective unemployment rate in IT of some 6%, it is rather odd for companies to be making the case that H-1B visas should be more widely available, which in fact they are doing:
http://www.infoworld.com/article/03/09/17/HNh1bhearing_1.html
http://www.pcmag.com/article2/0,4149,1274182,00.asp
A comprehensive review of the new Apple Power Mac G5, equipped with dual 2 GHz processors, 2GB RAM, and a 160GB hard drive, all of which drive the price up from the merely unsettling to the positively deterring, if you are an individual buyer. The article suggests that the G5 delivers on its promises, and is quite competitive in performance and price with Wintel machines configured around the same price point.
Although I would like a hardware RAID SCSI array for a machine of this price and quality....
http://www.extremetech.com/article2/0,3973,1274197,00.asp
The significance of Intel's "LaGrande" security initiative has been remarked upon before -- here is a good detailed review of what is going on with this initiative:
• Objectives and Components
• Policy, Target Markets, and Rollout
• Trusted Platform Architecture Review
• Inside LaGrande CPU and Chipset Modifications
• Protected Environment Setup – Initial Steps
• Launching Protected Domain
• Handling Special Cases.
Once the facts are understood, we can conduct the debate relating to this technology in a more rational and perhaps constructive manner.
http://www.pcmag.com/article2/0,4149,1276959,00.asp
Now 64-bit processing is available to us on the desktop, with three desirable [and pricy!] machines under review here, in addition to related links, including one to Apple's [not-so-pricy in comparison] G5.
But wait! There is more! Microsoft is speedily rolling out Windows versions to make use of both AMD's and Intel's 64-bit chips:
http://entmag.com/news/article.asp?EditorialsID=5966
The extreme technical viewpoint, with some beta tests and a great deal of additional technical information, can be found at:
http://eletters.wnn.ziffdavis.com/zd/cts?d=75-63-1-1-618817-2626-1
One of the major points being made here is that "who needs it?" = "Gamers!" -- but then, what gets adopted in business a short time later is all the technical development which gamers made possible in the first place. Again, the history of the 16-bit to 32-bit transition is worth reviewing for useful pointers.
http://story.news.yahoo.com/news?tmpl=story2&u=/ap/20030920/ap_on_hi_te/classroom_gadgets
If any segment of the population is gadget-happy it is teenagers, the greater part of whom are in school, particularly secondary school. The "problems" such tools present to educators are discussed in this article, along with some "remedial" measures which school systems are taking.
Yet in some measure there is a disturbing resonance with the K-12 school system's habit of viewing everything which is different as a "problem". In fact, some consideration should be given to the degree to which these gadgets represent a new intellectual ecology, and should, therefore, become part of the curriculum rather than being seen as something to be resisted or suppressed.
I do not want to suggest this is simple, but I do want to suggest that attitudes often preclude possibilities -- and while resistance may be the simpler course, it is far from a satisfactory one.
http://utilitycomputing.itworld.com/4606/030916helpthemselves/index.html
If things go adrift at the most basic level on a computer, all you have left to interpret and troubleshoot are beep codes [though I have seen systems with a misplaced processor which did not even beep]. Here is a downloadable summary of the beep codes of the major PC BIOSes, to help people work their way out of this particular morass.
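To make the idea concrete, here is a toy sketch of the kind of lookup table such a summary boils down to; the entries are commonly cited AMI/Award codes given purely for illustration, and the downloadable chart above (or the board vendor's documentation) is the authority:

```python
# Toy beep-code lookup; entries are illustrative only -- consult the BIOS vendor's chart.
BEEP_CODES = {
    ("AMI", "1 beep"): "memory refresh failure (reseat or replace RAM)",
    ("AMI", "8 beeps"): "display memory read/write error (check video card)",
    ("Award", "1 long, 2 short"): "video error (check video card)",
}

def diagnose(bios: str, pattern: str) -> str:
    return BEEP_CODES.get((bios, pattern),
                          "unknown pattern -- consult the vendor's beep-code chart")

print(diagnose("Award", "1 long, 2 short"))
```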
http://utilitycomputing.itworld.com/4606/030916helpthemselves/index.html
The concept of autonomic computing, already discussed in this blog, promises to help reduce the complexity of network management. The URL indexes an article on the topic, setting out what is ongoing, along with whitepapers, webcasts, news articles, and featured topics.
http://www.infoworld.com/article/03/09/19/HNindustry_1.html
The Developer Forum in San Francisco was the latest venue for lamenting that the IT job market may be permanently down. A number of policies and trends [such as a continuing decline in technical education in the USA] mean that the trained workforce to staff the jobs is not there, even if the jobs were.
The advantages of outsourcing include tapping into a global pool of talent, although there are real difficulties here as well. It does seem clear that the more "conceptual" an IT element is, the more likely it is to be outcountried. It is easier to manage programmers remotely than to implement helpdesk repairs the same way.
http://searchwin2000.techtarget.com/bestWebLinks/0,289521,sid1_tax294061,00.html?Offer=w2kamd
A straightforward listing of annotated Web links to 64-bit computing: hardware, software, the background behind it, the latest news, and a glossary of terms.
http://searchwin2000.techtarget.com/originalContent/0,289142,sid1_gci916094,00.html?Offer=ws92233
Since Active Directory was the major new technology introduced with Windows 2000, continuing as a mainstay in 2003 Server, a good learning guide to its plan and design is certainly worth a look. This article goes into considerable detail, covering Active Directory basics, planning your Active Directory, Active Directory migration, and Active Directory deployment. Many of these topics are supported by white papers, articles, and web links; a set of Resources includes tips, tricks, and traps as well as some webcasts on more esoteric issues, such as migrating from UNIX or executing an Exchange Server upgrade.
http://www.microsoft.com/windows2000/windowsupdate/sus/
Courtesy of Serdar Yegulalp's Windows 2000 Power Users newsletter, here is a free download of a tool which allows network rollout of patches and updates.
http://itzone.ziffdavis.com/download.html
Site devoted to helping small and medium-sized businesses use information technology, with downloadable white papers, plus relevant news, reviews, and articles. Some promotional material from HP is also provided.
http://www.siliconvalley.com/mld/siliconvalley/6804154.htm
Short article providing a clear indication of what grid computing is supposed to do, and what real applications it can support. Of course, when you look at the history of the InterNet, then the idea that everyone will have a supercomputer at his beck and call simply means we will get more pornography faster.
Another article, suggesting that vendors and purchasers are not quite on the same hymnbook page yet, can be found here:
http://www.infoworld.com/infoworld/article/04/01/16/03FEgrids_1.html
This article describes how grid computing is moving from the science laboratory into the business workplace:
http://www.nwfusion.com/news/2004/0126grid.html
An annotated list of utilities that one power user finds close to indispensable, with the added fillip that anchors in most entries take you to a page where you can download the item. In addition to networking utilities, the list includes a number of programmer's utilities as well.
http://news.com.com/2100-1006_3-5078125.html
AMD has announced that it plans to expand its 64-bit x86-compatible Athlon chip into desktops and notebooks today, while exploring an "x86 everywhere" architecture for the near future. AMD has extensive leveraged resources which allow it to contemplate this. The upshot could be surprisingly faster and more powerful dispersed devices like handhelds and embedded systems.
We may be making the world smarter and faster than we ever thought.
http://www.businessweek.com/technology/content/sep2003/tc20030916_6815_tc129.htm
Article discusses the need for reform and restructuring of the InterNet -- in a manner analogous to the post-9/11 response, there is recognition of the cost of openness and a determination to change things. The problem in both cases is that the bathwater and the baby have to be carefully separated, and the nature of the InterNet decision-making process makes unintended consequences [because non-technical 'voices' will not be represented] more likely.
Yet, given that even I have agreed that Something Must Be Done, it is hard to cavil when you get what you wanted, pace Oscar Wilde.
http://www.improb.com/airchives/paperair/volume9/v9i5/murphy/murphy0.html
A 4-part article from the Annals of Improbable Research about the origins of one of the major laws affecting everything, not just IT: Murphy's Law. It seems entirely appropriate that at the time I visited it, one of the four parts in the series had not been posted.
http://news.bbc.co.uk/2/hi/technology/3109146.stm
In North America, we are quite used to paying flat rate for broadband access, and think this totally justified when we consider the pernicious results of per-minute access fees, especially as these apply in Continental Europe. This article argues that flat rate fees cannot be justified in an age of file sharing, because network congestion is the result.
Without disputing anything this article says, I suspect that the negatives of per-minute charging in terms of the "InterNet commons" far outweigh any benefits it might create. An alternative, if one were necessary, would be a tiered band of flat rate access, with anything above this band invoking a surcharge. This would mean that the high bandwidth users would pay extra, while the vast majority of people who use the Net for fun and profit would still continue on a flat rate.
One wonders what consumer resistance would result if flat rate was to be abandoned in North America.
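For what it is worth, the tiered alternative floated above amounts to nothing more complicated than this; the dollar figures and cap are invented for illustration:

```python
# Sketch of tiered flat-rate billing: flat fee up to a monthly cap, surcharge above it.
def monthly_bill(gb_used: float, flat_rate: float = 40.0,
                 cap_gb: float = 30.0, surcharge_per_gb: float = 1.50) -> float:
    overage = max(0.0, gb_used - cap_gb)
    return flat_rate + overage * surcharge_per_gb

for usage in (5, 30, 120):   # light user, at the cap, heavy file-sharer
    print(f"{usage:4d} GB -> ${monthly_bill(usage):.2f}")
```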
While not directly concerned with IT per se, this blog covering all sorts of gadgets strikes deep into the heart of ultimate geekiness. As someone who fervently believes that "less is more" [except, of course, when it really is less] and who cannot find any of his cameras, only a couple of which work anyway, I find this site is not to my particular taste. But lots of IT professionals are interested in this sort of thing, and here is a good way to keep up with the silicon flood.
The point being: "Everyone to his own!" as the old lady said when she kissed the cow.
http://wired.com/news/politics/0,1283,60461,00.html
The abuses perpetrated under the Digital Millennium Copyright Act are sufficiently well known among IT folk, but now, from this article, it appears that at least one USA Senator understands what the balance between corporate and individual rights ought to be. He has introduced a bill which would significantly up the ante in difficulty for industries seeking to force data from an ISP and, by extension, to violate the privacy of an individual consumer.
The absence of checks and balances in the DMCA should be a serious concern to all those outside the industry who are concerned not only with rights, but with maximizing information flows for the greatest public benefit. One has to hope that this legislation represents the first step in the rollback of rights losses.
http://aroundcny.com/technofile/index.cfm
A rich selection of articles and essays, covering technology in general [for example, dealing with digital photography], but with many useful reviews, essays, and advice on both hardware and software, including operating systems. The page is neatly and cleanly indexed and laid out, giving fast loading and easy navigation.
That the principal's picture looks a lot like Stacy Keach may or may not be an issue.
http://www.pcworld.com/news/article/0,aid,112519,00.asp
Intel is no stranger to social controversy stemming from technical decisions, and it has wisely seen that the "LaGrande" technology, which had all sorts of potential to make information decidedly 'unfree', is not a blanket solution. Some Intel chips will use the technology, and others will not, and people will be free to choose which variety they want. Nor will 64-bit Itanium chips have the technology.
The LaGrande technology is expected to be in production by 2006. It is pleasant to see that Big Brother will not be a necessary add-on to every Intel product purchased by then.
http://www.businessweek.com/technology/content/sep2003/tc20030916_9564_tc129.htm
The article indexed by this URL is interesting enough in its own right, indicating how many vital public networks remain vulnerable to destructive intrusion, what countermeasures are required, and what these will cost. This is a useful reminder of priorities and problems as they relate to high-value systems, as well as the degree to which we have developed an inextricable dependency upon them.
In addition, a wide variety of interesting special reports, reviews, and other articles relevant to IT and networking are indexed in the sidebars.
http://www.vnunet.com/News/1143673
Article summarizing an ITU report indicating that broadband subscriptions grew 72% in 2002, to some 62 million, representing something like 20% of the global community using the InterNet. The upside of this is a growing market which can take advantage of broadband services and products. The downside is the degree to which most InterNet users are still drinking an Atlantic Ocean of content through a very thin drinking straw.
http://whitepapers.comdex.com/data/detail?id=1063320526_115&type=RES&src=KA_RES
The rapid development of a whole host of security threats, particularly to Microsoft systems, has made patch management a major issue. Surveys showing significant numbers of unpatched servers are cause for much hand-wringing and finger-pointing. In this paper: "Give IT Managers a Break: Why Patching Is Not Easy", some of the mitigating circumstances are discussed, along with a recognition that patch management is a difficult, wearisome, and time-consuming process.
Another white paper which addresses "Taking the Risks and Guesswork Out of Patch Management", in an area which amounts to a $2 billion/year expenditure, can be found here:
http://ewrma.com/ct?id=5393-3181-3409848
Microsoft, for one, is trying to reduce the problems caused by patching, as explained in their white paper "Beyond Security Patching", available here:
http://whitepapers.comdex.com/data/detail?id=1076340273_667&type=RES&src=KA_RES
http://whitepapers.comdex.com/data/detail?id=1063389252_145&type=RES&src=KA_RES
The increasing scope of attacks on networks makes a white paper covering "Optimizing Security and Network Operations" particularly useful.
Microsoft's loss of a patent infringement suit relating to Web browser display technologies not only inflicted a fairly hefty fine on it, but also put at risk the browsing capabilities of its most widely used tool. The article recounts efforts being made to minimize this damage, and affects an optimistic stance.
This, like the recent decision legalizing scumware, is another example of a narrow application of the legal system resulting in untold harm for hundreds of millions of InterNet users. One wonders how much longer such a pernicious and iniquitous system of behaviour can continue to exist, or why the general community does not rise up in protest.
http://utilitycomputing.itworld.com/4604/030910differentiate/index.html
Links to a number of downloadable .PDF files on grid computing, plus some articles and webcasts relating to this technological development. Grid computing, if properly managed, has the potential to stand what we mean by "computer access" on its ear.
Here is another article covering the development of grid technology:
http://www.computerworld.com/softwaretopics/software/story/0,10801,85884,00.html
A good summary of the technical and nontechnical issues is provided in this article:
http://www.economist.com/business/displayStory.cfm?story_id=2352183
http://www.eweek.com/article2/0,4149,1269036,00.asp
The CTO at Intel opines that desktop PCs will not need 64-bit computing for several more years. Of course the fact that AMD has a backwards-compatible 64-bit chip available and Intel does not has nothing to do with this expression of viewpoint. Nor does the existence of the Apple G5 line.
Seems to me I can remember something like this before, when the 32-bit processor was first mooted: "Who will need a GB of memory?" was the rallying cry then. There is no reason to think the future will be the same as the past, unless, of course, we have reason to think so -- and in this case, I suspect we do.
http://www.economist.com/printedition/displayStory.cfm?Story_ID=2051736
Article describing efforts at the IBM Almaden Research Center to build computers using "collective intelligent bricks", currently enabled as 8" cubes with 80GB of disk data storage, a CPU, and metallic connectors which enable the bricks to be stacked together, communicating without wires at a high data rate.
The individual components of this technology may not impress, but a simple 3 x 3 x 3 array of such computer bricks could store 25TB [or one Library of Congress]. A multiple-brick system would be computing's version of The Immortal Chicken heart, although cooling problems represent a major issue to be addressed.
http://www.computerworld.com/managementtopics/outsourcing/story/0,10801,84861,00.html
Article discussing how offshore outsourcing in the IT industry is a trend which can only grow in impact. Discusses the various types of outsourcing as well as the effects of these on jobs within the North American sector. The bottom line: outsourcing is "an irreversible megatrend" in the IT employment sector which has deep and continuing penetration in the industry.
http://www.wired.com/news/politics/0,1283,60453,00.html
The USA Senate has cut some of the budget submitted by the Defense Advanced Research Projects Agency, which acts as the cutting-edge military technology supporter for the USA. Much of the legislative ire is directed against the Information Awareness Office, which represented a dubious initiative at best. However, other projects which have potential civilian as well as military benefits, such as a project to control mechanical limbs using thought, have also been axed.
Given the average level of technological sophistication of most congresscritters, having them make micromanagement decisions in what was the crown jewel supporting USA military effectiveness and advantage does not seem like a particularly beneficial idea.
http://siliconvalley.internet.com/news/article.php/3076961
The issues involved in the concept of the "semantic Web" have been touched on previously, and as this article reports, developments in standards are proceeding apace. The result will be a more powerful, effective, and user-friendly Web, which in turn ought to guarantee continued interest in the InterNet as an enabling technology.
As one commentator notes, bloggers are at the forefront of this development.
http://www.villagevoice.com/issues/0337/baard.php
The rush towards military robotics is another of these technological streams which is making science fiction fact. Some scientists, for personal reasons, are refusing to accept sponsorship or direction from military sources. There are, however, sufficiently many researchers who are perfectly willing to work on such projects that the overall thrust of developments in this arena is unlikely to be halted. "The Terminator" it isn't, but robotic innovations [some of which have already been field tested] which help make the difficult, dangerous, dirty, and undesirable job of city-fighting more manageable while reducing friendly casualties represent just one strand of this effort. Those sponsoring it, and more important, the soldiers using it, are, in general, firm advocates of this technology.
Of course the ultimate upshot of this antimilitary stance is that people will die, instead of machines.
While the Computer Security Institute site is a good source in itself to keep tabs on training for security in general and the CSI certifications in particular, as this entry is written there is in addition a downloadable report available: the eighth annual Computer Crime and Security Survey, which clearly shows that computer crime and security remain significant issues.
http://www.crime-research.org/
Web site of the Computer Crime Research Center, a non-profit subsidiary of The American University, providing many useful security resources. A cleanly laid-out page indexes news, articles, analytical essays, and interviews, with links to Legislation, Seminars, other links, and a news archive.
The site offers a free newsletter and is searchable, and provides a fast way to keep up to date with what is going on in the rapidly-changing world of network security.
http://www.virusbtn.com/vb100/archives/products.xml?table
A comparative display of reviews of nearly three dozen antivirus products, as tested against Windows 2000, NT, RedHat Linux, XP Professional, and Netware 6.0. The results can be displayed in a variety of ways, by vendor or by platform, with a summary overview of the pass/fail capacity of the specific product, and a link to the vendor Website. The most interesting thing that comes out of this is that there is no single product which is 100% effective across all the platforms tested, although it is unlikely any single machine in a production environment would be using all of these operating systems at once.
This suggests that more than one antivirus product may be needed in a network running a variety of operating systems.
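Choosing which products to deploy is, in effect, a small set-cover problem. Here is a greedy sketch; the products and pass/fail data are invented for illustration and bear no relation to the actual review results:

```python
# Greedy set cover: pick antivirus products until every platform in use is covered.
coverage = {
    "Product A": {"Windows 2000", "Windows XP", "NT"},
    "Product B": {"Red Hat Linux", "NetWare 6"},
    "Product C": {"Windows XP", "NetWare 6"},
}
platforms_in_use = {"Windows 2000", "Windows XP", "NT", "Red Hat Linux", "NetWare 6"}

chosen, uncovered = [], set(platforms_in_use)
while uncovered:
    best = max(coverage, key=lambda p: len(coverage[p] & uncovered))
    if not coverage[best] & uncovered:
        break                                  # nothing covers what remains
    chosen.append(best)
    uncovered -= coverage[best]

print("Deploy:", chosen, "| still uncovered:", uncovered or "nothing")
```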
http://www.only4gurus.com/v2/index.asp
A somewhat ad-laden but nevertheless effectively arranged front end to information relating to Microsoft issues, linking mainly to Microsoft sites in an easily-used format.
http://members.bellatlantic.net/~mrscary/winupfaq.htm
A collection of Microsoft Knowledge Base articles addressing the problems of updating Windows automatically, with the major focus being Windows 98. Some additional links at the bottom of the page cover more general Windows 98 update and configuration issues.
http://www.cse.unsw.edu.au/~geoffo/humour/flattery.html
The main purpose of this site is just to make you smile -- you go there, enter your name in the dialog box, and the Web page proceeds to flatter you with uplifting comments. This is sufficiently amusing in itself to be diverting, but [as is often the case with this sort of thing] it represents the still surface over roiling water beneath -- the degree to which software and hardware can mimic human responses when we interact with them.
http://windowsxp.homedns.org/xp/
The URL instances a blog with information, including downloads, relating to Windows 2000, XP, and 2003, along with scripts and some scrolling news. While interesting in its own right, the download available at the site: "The Tweaking Experience" serves as a guide to this activity for 2000 and XP alike, and is well worth the trip all by itself.
http://pugetsoundsoftware.com/askleo-blog/archives/000015.html
For Microsoft Windows vict- er, users, .DLL problems are a way of life. Here is a short article with a number of remedial steps you can take to try to work yourself out of the famous ".DLL Hell".
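One of the standard first diagnostic steps is simply finding out where multiple copies of the same DLL live. A hedged sketch of that step (the paths are examples only, and this is not from the article):

```python
# Walk a few directory trees and flag DLLs that appear under the same name in more
# than one place -- the usual starting point for version-conflict ("DLL Hell") hunts.
import os
from collections import defaultdict

def find_duplicate_dlls(roots):
    seen = defaultdict(list)
    for root in roots:
        for dirpath, _dirs, files in os.walk(root):
            for name in files:
                if name.lower().endswith(".dll"):
                    seen[name.lower()].append(os.path.join(dirpath, name))
    return {name: paths for name, paths in seen.items() if len(paths) > 1}

if __name__ == "__main__":
    for name, paths in sorted(find_duplicate_dlls(
            [r"C:\Windows\System32", r"C:\Program Files"]).items()):
        print(name)
        for p in paths:
            print("   ", p)
```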
http://whitepapers.comdex.com/data/detail?id=1046792460_261&type=RES&src=KA_RES
A white paper covering what is involved in "Fixing Ethernet and Fast Ethernet Link Problems", which can be of aid in setting up trouble-shooting exercises and devising solutions for them.
http://www.3dcenter.org/artikel/cinefx/index_e.php
If you want to go into extremely deep analysis of the CineFX pixel engine in NVidia's NV30 graphics chips, here are all the technical details based on the published patent. The point to hoist inboard here is how much is going on at the chip level to produce visual displays, of which we are completely unaware on a day-to-day basis.
http://news.bbc.co.uk/1/hi/sci/tech/3097904.stm
Here is the upside of the ID chip issue -- devices which can sense proximity to other devices, and emit alarms if they are moved [in particular, by an aspiring thief]. This is RFID in another guise, and again is part of the whole surge to making the world smart.
Like many such Janus-faced technologies, the question is: do we want the benefits sufficiently to put up with those drawbacks which cannot be mitigated? My suspicion, in this particular instance, is that the answer is "Yes, please!".
http://www.projectcensored.org/publications/2004/6.html
Summary of an article which views with alarm, from a left-wing perspective, the degree of concentration in the telecommunications industry, which in turn has the potential to throttle access to broadband InterNet services. The alarm may well be real -- these companies have never acted in the public interest and represent corporate crassness at its most base.
But the idea that the whole Net is thus imperilled neglects the fact that dial-up still exists, and can easily continue doing so indefinitely. It may be "low and slow", but it still allows one to get the message out.
http://www.informationweek.com/story/showArticle.jhtml?articleID=14200065
Short article on the availability of different levels of AMD's "Opteron" competitor to Intel in the 64-bit computing arena. The top-end chip in the line costs over $3K in quantity lots [which price can buy one a pretty decent 32-bit desktop system these days], but the fact that the low-end chip in the same line costs under $700 means that if one wants it badly enough, 64-bit home computing is within an individual's grasp.
http://eletters.wnn.ziffdavis.com/zd/cts?d=75-57-1-1-618817-2443-1
Topping the previous story off, the above link fetches an article showing considerable detail relating to the performance results of the Opteron chip.
http://www.informationweek.com/story/showArticle.jhtml?articleID=14200065
"East is East, and West is West/And never the twain shall meet" might have been Kipling's take on globalization, but the "small world" created by the InterNet means computers the world over face the same sort of security threats. A summary of the 2003 Global Information Security Survey from Information Week suggests both the nature of the threats which apply in this arena and some of the costs associated with their activation and cure.
http://www.technewsworld.com/perl/story/31502.html
One of my colleagues uses the tagline: "If you think education is expensive, try ignorance!". This article suggests he is literally right -- that workers with computers everywhere suffer economic losses at home and at work, because they don't know much about their machines. While I myself have been heard to whine "It shouldn't be this hard!", I also have a wealth of anecdotal evidence suggesting that most organizations suboptimize computer use by scrimping on training, when in fact the returns from more educated computer workers should greatly outweigh education's costs.
The need for extensive basic training, probably on a certification basis, appears hard to dispute.
http://www.computerworld.com/networkingtopics/networking/management/story/0,10801,84604,00.html
The concept of "autonomic computing" warms IT managers' hearts, even as it chills IT workers' souls. The capacity for systems to configure and repair themselves automatically bids to be more reliable than the current process, as well as being less labour-intensive, and less expensive. The article indicates that this trend is well-established, but will really begin paying off in the short- to medium-term future.
One wonders the degree to which knowledgeable IT technicians may be tempted to resist implementing such systems in their own economic self-interest. One also remembers the cheery riposte to those who feared a computer takeover: "You can always pull the plug!" -- well, now maybe we can't.
http://www.reuters.com/newsArticle.jhtml?type=topNews&storyID=3420519
In another development bringing SF closer to reality, the introduction of new display technologies which can project images essentially "in thin air" is discussed. Whether these devices are accepted or not depends on a number of rather complex issues, but there is a deeper factor to consider.
If, as I have ruminated previously, we are "making the world smart", then we need some way of being able to access this, and we can't hang a CRT or LCD display from every tree branch. This form of projective technology could have a major impact in, for example, making highway signage more interactive and responsive to conditions. It could be the interface we need to relate to a vastly more information- and connection-rich world.
Which leaves the question: What will SF writers have left to talk about? Since this question was asked after the first moon landing, and there has been at least a small library wing of true SF published subsequently, I don't think there is any reason to worry on that score.
http://www.technewsworld.com/perl/story/31534.html
One of the major hurdles to a viable networked virtual reality has been the development of haptics technology, allowing the sense of touching/feeling to be extended beyond the individual body. The article discusses research developments in this area, and suggests that while the initial enthusiasm about VR was [like just about every other IT/high tech enthusiasm] overblown, nevertheless we will be able to move forward in this area.
The real question: will we want to? Whether this becomes a limited niche market or the realization of reams of SF depends on consumer acceptance, itself often something which can be neither predicted nor reasoned with.
http://24hour.startribune.com/24hour/technology/story/986150p-6925945c.html
Short article on plans to build a supercomputing grid which would do for computational power what the WWWeb did for communication. One direction from which "personal supercomputing" is coming is the increasing capability of desktop systems, particularly when clustered together [the "Cray 1 in the basement" effect], but even as such systems approach the power of supercomputers in the past, so does the leading edge bleed forward.
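To put some rough numbers on the "Cray 1 in the basement" effect: the Cray-1's peak was on the order of 160 MFLOPS, while the per-desktop figure below is simply an assumed ballpark for a current machine:

```python
# Coarse comparison of desktop and cluster throughput against the Cray-1's peak.
CRAY_1_MFLOPS = 160            # historical peak, approximate
desktop_gflops = 5.0           # assumed ballpark for a current desktop
cluster_size = 16              # a modest lab cluster

desktop_mflops = desktop_gflops * 1000
print(f"one desktop ~ {desktop_mflops / CRAY_1_MFLOPS:.0f}x a Cray-1")
print(f"{cluster_size}-node cluster ~ {cluster_size * desktop_mflops / CRAY_1_MFLOPS:.0f}x a Cray-1")
```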
Should this come about, the diffusion of computational power represented ought to have the same range of unexpected effects, both good and bad, that the introduction of the InterNet itself has thus far had.
http://www.baddteddy.com/tutorials/virus.htm
While the layout of this site is somewhat eye-straining, it is worth considering as a pedagogical resource, not only because of the importance of its subject, but also as an exercise in evaluating Web information. I was led to this by the "Infopackets Gazette" listserv, which is intended for computer newbies, and which serves me as a valuable source of tutorials and other pointers for those just beginning to find where the "On" switch is [and I will confess without a blush that once in a while I find something directly useful on the Infopackets site].
Key questions to ask: is the technical content convincing? Does it agree or disagree with other major sites dealing with the same material? Does the particular website and the anonymous authorship increase or decrease credibility?
Sometimes a gem can be flawed, and yet spark our imaginations.
http://www.zdnet.com/anchordesk/stories/story/0,10738,2914622,00.html
Despite the concerns which have been expressed about its negative aspects, Radio Frequency ID tagging of merchandise is a beast which has not only slouched towards Bethlehem, but also is now sitting down eating soup with a long spoon. The commercial advantages of this technology are so great that we cannot realistically expect to hold it back, demonstrating once again how, when human values conflict with economic values, the former lose.
However depressing this may be as an overall prospect, it appears an aspect of this modern world which we cannot escape. Again, this is one of those "iceberg" technologies, towards which we are charging at full speed, frantically re-arranging the deck chairs as we go.
http://www.zdnet.com/anchordesk/stories/story/0,10738,2914622,00.html
Article claiming that those who assist in the propagation of viruses and worms are being penalized too harshly by the criminal justice system. I have already made clear my predilection for physical mutilation without anaesthesia as the proper remedy, so it should come as no surprise that I disagree vociferously with the author.
Virus and worm release represents a form of "one-to-many" crime, just as, for example, does a Ponzi scam. The individual loss in any case may not amount to much, but collectively the loss is enormous. It is exactly this loss which should be penalized [along with, of course, the loss of public trust which results], and harsh penalties are certainly supportable. The fact that some other people have been treated more leniently is, to my view, unfortunate, but it is no justification to treat all malware offenders that way.
Of course, given that I am a vengeful SOB, this opinion should not be all that surprising.
http://www.bernardbelanger.com/computing/NaDa/index.html
The nature of the Zen experience has, I think, not been applied with sufficient rigour [which of course begs the question whether "Zen" and "rigour" are at all compatible, in, say, the way in which "pasta" and "rigour" are compatible] to the computing world. Where, oh where, is the computing equivalent of Robert Pirsig's _Zen and the Art of Motorcycle Maintenance_?
If the answer to this is "We doan' need no steenkin' manuals" then I think, without let or hindrance, that the point I have been endeavouring to make here emerges from the foam of fractious disputation like the peak of Mount Fuji poking through the overcast on a sunny Friday morning.
Well, then, it may be objected, surely it is appropriate to request some automated aids towards this end. Surely it is, and just as surely, Shirley, the download at this site will prove as elegant as it does functional in such realization.
Like the Big Red Button Which Does Nothing, you really have to see this for yourself. And that, or its converse, is the essence of the Zen experience, to the extent to which I can claim that the essence of the Zen experience can be understood.
http://www.savetheservers.com/
Man's inhumanity to man is, perhaps, only exceeded by man's inhumanity to servers. Graphic pictures which one cannot view with a stable stomach, no matter how hard one might try.
http://www.backup-software-reviews.com/
A true network professional never backs up data -- he can restore anything! On the other hand, backing up is one of those mundane teeth-flossing activities which can be the salvation of a network, and if you are being paid the big bucks, you better have the system backed up.
There are lots of backup products out there: this article reviews them, and selects the five best. The review methodology is also available from this site.
Now make sure you write down the URL, just in case your browser chokes on it....
http://www.masternewmedia.org/2003/08/28/blogs_as_instruments_for_effective_pr.htm
Somewhat overheated article discussing what WebLogs are, and for what they are most useful. Concludes they can be valuable sources of PR, but also contains a grab bag of links to articles related to blogging, plus books available for purchase on various aspects of information management.
http://newsforge.com/article.pl?sid=03/08/27/132243
Somewhat cynical article suggesting that IT outsourcing will continue to grow, and represents something irreversible. The only area of IT which is "safe" is direct support of the hardware infrastructure -- and to the degree to which this can be managed remotely, not even this niche is safe.
There is a high irony in IT workers becoming victims of a competitive environment which they themselves were instrumental in creating.
http://www.edbott.com/protect_your_pc.htm
Ed Bott is a well-known name in the PC world; in this article he outlines the elements of a comprehensive security policy for Windows 2000 and Windows XP machines. Another section on his site:
http://www.edbott.com/windows_tips.htm
offers a whole host [practically a plethora, though certainly not a surfeit] of articles covering Internet Explorer, Windows Tweaks, Outlook Express, Hardware Troubleshooting / Maintenance, Privacy / Security, and Windows Update.
http://www.smh.com.au/articles/2003/09/04/1062548967124.html
Physical security is sometimes neglected in discussions of computer security; this article shows why it should receive some emphasis. The idea of someone stealing an entire mainframe is like something out of a movie [except that I don't think I have ever seen this particular stunt in anything I have seen].
http://www.dansdata.com/sbs3.htm
If you want to reduce your computer to a heap of junk [something I can sometimes do simply by looking at it], this article will give all the gory details. While part of a site generally devoted to computer humour, a discussion like this can be useful to reference for troubleshooting or "what not to do".
Here are three white papers on various aspects of networks and timekeeping [a minimal time-query sketch follows the list]:
The Importance of Network Time Synchronization http://whitepapers.comdex.com/data/detail?id=1051627365_102&type=RES&src=KA_RES
The Five Dangers of Poor Network Timekeeping http://whitepapers.comdex.com/data/detail?id=1051627365_106&type=RES&src=KA_RES
Synchronization Essentials of VoIP http://whitepapers.comdex.com/data/detail?id=1051627365_514&type=RES&src=KA_RES
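The time-query sketch promised above: a minimal SNTP client along the lines of RFC 2030, sending a 48-byte request to UDP port 123 and reading back the server's transmit timestamp. The server name is an illustrative choice, and real deployments would use a proper NTP daemon rather than anything this crude:

```python
# Minimal SNTP query (RFC 2030 style): ask a time server what time it thinks it is.
import socket
import struct
import time

NTP_EPOCH_OFFSET = 2208988800      # seconds between 1900-01-01 (NTP) and 1970-01-01 (Unix)

def sntp_time(server: str = "pool.ntp.org") -> float:
    packet = b"\x1b" + 47 * b"\0"  # LI=0, VN=3, Mode=3 (client request)
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.settimeout(5)
        s.sendto(packet, (server, 123))
        data, _ = s.recvfrom(48)
    transmit_seconds = struct.unpack("!I", data[40:44])[0]
    return transmit_seconds - NTP_EPOCH_OFFSET

if __name__ == "__main__":
    print("server time:", time.ctime(sntp_time()))
    print("local time: ", time.ctime())
```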
http://whitepapers.comdex.com/data/detail?id=1062090016_670&type=RES&src=KA_RES
A white paper titled "Fast Path to Secure Systems Architectures and Network Designs", directed towards enterprise/large-scale security, again from a source which has a lot of experience with this, both good and bad.
http://www.pcmag.com/article2/0,4149,1228782,00.asp
An overview of what we can expect in both enterprise and home networking in the near future, including hardware, software, and standards. The importance of networking for IT growth shows no signs of diminishing, so it is useful to have a look at where we might be going.
http://www.eweek.com/print_article/0,3048,a=58583,00.asp
"We will ship no operating system before its time." now appears to be a Microsoft mantra; this article speculates on reasons for the pullback. In addition, there is a box of related links to developments on this OS front.
One wonders if users have been polled on the desirability of a new OS, when they are still getting used to the old one. There might be even more reason for delay if this basic fact were hoisted inboard at Redmond.
http://www.extremetech.com/print_article/0,3998,a=58657,00.asp
From my viewpoint, this article with a title originated by Walt Kelly strikes the nail solidly on its most puissant point. Linux developers need to focus on application production, in a finished manner, for desktop acceptance to be widespread.
I receive a Linux listserv from LockerGnome [itself a source of many useful items in this blog], and a continuing, multi-part series there [up to #7 and still going], in which the editor wrestles with installation problems in getting what should be a simple and basic piece of software to work in a manner recalling an XFL game in the height of winter, makes this point plain. Reading it, I sat back and wondered why anyone would want to go through such a frustrating, hair-tearing experience.
By the sounds of this article, I am not the only one so wondering.
http://www.infoworld.com/infoworld/article/03/08/29/34FElinux_1.html
Another article on the costs of switching to Linux, with a number of interesting observations, including one suggesting Linux is not always the lower-cost solution. Points out as well that TCO needs to be related to ROI, rather than considered as a singular entity. If ROI rises faster than TCO, and there is a causal relationship, then "cheaper" is not always "better".
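A toy illustration of the point, with numbers invented purely for the sake of the arithmetic: once returns enter the picture, the lower-TCO option is not automatically the winner.

```python
# Compare two platforms on TCO, ROI, and net benefit (all figures invented).
options = {
    "Platform A (lower TCO)":  {"tco": 100_000, "returns": 130_000},
    "Platform B (higher TCO)": {"tco": 140_000, "returns": 220_000},
}
for name, o in options.items():
    net = o["returns"] - o["tco"]
    roi = o["returns"] / o["tco"]
    print(f"{name}: TCO ${o['tco']:,}, ROI {roi:.2f}x, net benefit ${net:,}")
```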
http://www.eweek.com/article2/0,3959,1234349,00.asp
One more look at the cost issues surrounding Linux, this time focussing on the out-of-pocket costs involved in deploying Red Hat Linux.
http://www.pcworld.com/howto/article/0,aid,111652,00.asp
One of my major gripes about the Windows OS is the bland assumption that after using the system for a couple of years, a re-installation to restore stability and function is required. The words I would use to indicate my opinion of this cannot be printed in a posting directed at a professional audience, but they can be categorized as "sulfuric".
The purpose of a computer is to do things, and software and the OS are the means to that end. We should, as a matter of right, be able to install software on a regular basis without having to worry about OS stability. But since we cannot, this detailed article on how to reinstall Windows may prove to be a lifesaver.
Personally, I will accept a lot of instability, because to reinstall and reconfigure my system would take, at a minimum, a man-week, which makes root-canal work without anesthetic during a hurricane while renegotiating your mortgage look pleasant in contrast.
http://news.bbc.co.uk/1/hi/technology/3189537.stm
The commonplace that gaming graphics is one of the major drivers of computer hardware development is challenged in this article, which suggests that graphics have evolved sufficiently that by themselves, they will no longer sell a game.
I am somewhat of two minds about this [being an avid purchaser of games who never actually plays them]. I agree that a game has to have something more than graphics in order to be successful; I also think one of the major issues with most games is that they interpose difficulty for most computer users instead of enhancing the gaming experience. Many potential users might be drawn in by a game which, like The Sims, is less a form of puzzle-solving or reflex-driven competition than an immersive experience.
In that case, I hardly think that the level of realism offered by what I have seen of current games represents the graphic ultimate -- which to me will only happen when I play a game and have a visual experience equivalent to watching a movie on TV. The crucial point will have been reached when instead of thinking "My how realistic that water looks!", I think "There is a stream to my left, and it really looks cold and dangerous".
And I don't think we are "there" yet -- but we are getting close, and should be at that level in 3 - 5 years tops. Then what are we going to do for hardware excitement?
This article makes the extremely good point that technology should be simple and cheap, and how hard and expensive it can be to bring that about. Certainly IT can never become a general purpose instrument without being much simpler than it is [composed of at least two parts -- the expertise to determine what/how to do, and the time in which to do it].
I have one reservation about this article: its recommendation to use the past as a way of making the future familiar. My fear is that this prevents the sort of mental liberation that new tools used in new ways have the potential to engender. Still and all, this is worth reading and considering.
Any time I look at my heap of malfunctioning PCs and bleat "It shouldn't be this hard!", I become an object lesson for the sort of thinking this article represents.
http://www.petri.co.il/index.htm
The formal title of this site is a bit misleading, since it holds hundreds of articles on 32-bit Windows variants, as well as information on patches/service packs, tutorials, links to other resources, an annotated booklist, and, naturellement, some information on becoming an MCSE.
There is a lot on this site, and the author has gone to great pains to explain how to access all of it in the initial "Welcome" screen. I experienced a JavaScript error on opening the page, but ymmv, and in any case, there still is one humongous amount of relevant information available.
http://news.com.com/2100-1024_3-5069571.html?tag=fd_top
While this article is dedicated to comparing the Google and Overture Net search engines, the "incidental" fact that the databases contain Web documents in numbers exceeding 3 billion gives some concept of how the Web has grown. Since the standard take on the size of the Web is that search estimates underestimate its size [in part because of issues involving the "invisible Web", and in part because some Web pages are set up to prevent indexing in the first place], there is more to the iceberg than one might think.
On the other hand, answering the question of how many of those 3 billion documents are of any use is something else again....
http://www.intuitor.com/moviephysics/
While not directly related to IT [it deals with physics as handled in the movies], this is an amusing site to visit, and does discuss some general educational issues. An equivalent site for computer hardware might be a useful effort, though some errors relating to computers are also handled at standard general blooper sites.
http://www.businessweek.com/bwdaily/dnflash/aug2003/nf20030827_6640_db016.htm
Brief article suggesting that there are more solid signs for optimism in terms of an economic recovery, which in turn will result in more innovation. Should this in fact come to pass, of course, it does not mean that those areas of the IT industry which were wrecked in the bubble collapse will suddenly come roaring back. A more likely prospect: a majority of the new jobs will demand familiarity with new technology, not a background in what's here now.
On the other hand, the whole IT infrastructure issue is one which won't go away, and there are limits to how much of this can be parcelled out to India. If an economic upturn creates a need for a more robust and extensive network mesh, we may yet see some recovery in some [but not all] traditional IT specialties.
http://www.ingrian.com/resources/index.html#wp
While the initial impetus for this posting is a white paper on "Five Threats to Data Security", this site also contains other related papers, plus specifications for the company's products and solution/fact sheets on how these can be applied.
Some Web resources dealing with Linux commands:
Alphabetical Directory of Linux Commands
http://www.onlamp.com/linux/cmd/
LinuxCommand.org
http://gd.tuwien.ac.at/linuxcommand.org/
Linux Commands and Shell Commands Libraries
http://gd.tuwien.ac.at/linuxcommand.org/
Some Useful Linux Commands
http://www.er.uqam.ca/nobel/r10735/unixcomm.html