Internet 2018-02-02T01:36:09+00:00

ARPANET, the Beginning of the Internet

Libertarians often cite the internet as a case in point that liberty is the mother of innovation. Opponents quickly counter that the internet was a government program, proving once again that markets must be guided by the steady hand of the state. In one sense the critics are correct, though not in ways they understand. The internet indeed began as a typical government program, the ARPANET, designed to share mainframe computing power and to establish a secure military communications network. The Advanced Research Projects Agency (ARPA, now DARPA) of the United States Department of Defense funded the original network.

Of course the designers could not have foreseen what the (commercial) internet has become. Still, this reality has important implications for how the internet works — and explains why there are so many roadblocks in the continued development of online technologies. It is only thanks to market participants that the internet became something other than a typical government program: inefficient, overcapitalized, and not directed toward socially useful purposes.

In fact, the role of the government in the creation of the internet is often understated. The internet owes its very existence to the state and to state funding. The story begins with ARPA, created in 1958 in response to the Soviets’ launch of Sputnik the year before, and established to research the efficient use of computers for civilian and military applications.

During the 1960s, the RAND Corporation had begun to think about how to design a military communications network that would be invulnerable to a nuclear attack. Paul Baran, a RAND researcher whose work was financed by the Air Force, produced a classified report in 1964 proposing a radical solution to this communication problem. Baran envisioned a decentralized network of different types of “host” computers, without any central switchboard, designed to operate even if parts of it were destroyed. The network would consist of several “nodes,” each equal in authority, each capable of sending and receiving pieces of data.

Each data fragment could thus travel one of several routes to its destination, such that no one part of the network would be completely dependent on the existence of another part. An experimental network of this type, funded by ARPA and thus known as ARPANET, was established at four universities, using four host computers, in 1969.

From Wikipedia:

The first successful message on the ARPANET was sent by UCLA student programmer Charley Kline, at 10:30 pm on 29 October 1969, from Boelter Hall 3420. Kline transmitted from the university’s SDS Sigma 7 Host computer to the Stanford Research Institute’s SDS 940 Host computer. The message text was the word login; on an earlier attempt the l and the o letters were transmitted, but the system then crashed. Hence, the literal first message over the ARPANET was lo. About an hour later, after the programmers repaired the code that caused the crash, the SDS Sigma 7 computer effected a full login. The first permanent ARPANET link was established on 21 November 1969, between the IMP at UCLA and the IMP at the Stanford Research Institute. By 5 December 1969, the entire four-node network was established.

Researchers at any one of the four nodes could share information, and could operate any one of the other machines remotely, over the new network. (Actually, former ARPA head Charles Herzfeld says that distributing computing power over a network, rather than creating a secure military command-and-control system, was the ARPANET’s original goal, though this is a minority view.) Al Gore was not present!

By 1972, the number of host computers connected to the ARPANET had increased to 37. Because it was so easy to send and retrieve data, within a few years the ARPANET became less a network for shared computing than a high-speed, federally subsidized, electronic post office. The main traffic on the ARPANET was not long-distance computing, but news and personal messages.

1972: BBN’s Ray Tomlinson introduces network email, and the Internetworking Working Group (INWG) forms to address the need for standard protocols.

But Arpanet had a problem: it wasn’t mobile. The computers on Arpanet were gigantic by today’s standards, and they communicated over fixed links. That might work for researchers, who could sit at a terminal in Cambridge or Menlo Park – but it did little for soldiers deployed deep in enemy territory. For Arpanet to be useful to forces in the field, it had to be accessible anywhere in the world.

Picture a jeep in the jungles of Zaire, or a B-52 miles above North Vietnam. Then imagine these as nodes in a wireless network linked to another network of powerful computers thousands of miles away. This is the dream of a networked military using computing power to defeat the Soviet Union and its allies. This is the dream that produced the internet.

Making this dream a reality required doing two things. The first was building a wireless network that could relay packets of data among the widely dispersed cogs of the US military machine by radio or satellite. The second was connecting those wireless networks to the wired network of Arpanet, so that multimillion-dollar mainframes could serve soldiers in combat. “Internetworking,” the scientists called it.

Internetworking is the problem the internet was invented to solve. It presented enormous challenges. Getting computers to talk to one another – networking – had been hard enough. But getting networks to talk to one another – internetworking – posed a whole new set of difficulties, because the networks spoke alien and incompatible dialects. Trying to move data from one to another was like writing a letter in Mandarin to someone who only knows Hungarian and hoping to be understood. It didn’t work.

In response, the architects of the internet developed a kind of digital Esperanto: a common language that enabled data to travel across any network. In 1974, two Arpa researchers named Robert Kahn and Vint Cerf (the duo said by many to be the Fathers of the Internet) published an early blueprint. Drawing on conversations happening throughout the international networking community, they sketched a design for “a simple but very flexible protocol”: a universal set of rules for how computers should communicate.

These rules had to strike a very delicate balance. On the one hand, they needed to be strict enough to ensure the reliable transmission of data. On the other, they needed to be loose enough to accommodate all of the different ways that data might be transmitted.

“It had to be future-proof,” Cerf tells me. You couldn’t write the protocol for one point in time, because it would soon become obsolete. The military would keep innovating. They would keep building new networks and new technologies. The protocol had to keep pace: it had to work across “an arbitrarily large number of distinct and potentially non-interoperable packet switched networks,” Cerf says – including ones that hadn’t been invented yet. This feature would make the system not only future-proof, but potentially infinite. If the rules were robust enough, the “ensemble of networks” could grow indefinitely, assimilating any and all digital forms into its sprawling multithreaded mesh.
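The flavor of that universal set of rules survives in today’s IPv4 header, a direct descendant of the Kahn–Cerf design (their 1974 protocol differed in detail; this sketch follows the modern RFC 791 layout and is illustrative only). Every field below is something all networks must agree on – addresses, length, a hop limit – while everything about how the bytes actually move is left to each individual network:

```python
import struct
import socket

def ipv4_checksum(data: bytes) -> int:
    """Standard one's-complement Internet checksum (RFC 1071)."""
    if len(data) % 2:
        data += b"\x00"
    total = sum(struct.unpack("!%dH" % (len(data) // 2), data))
    while total > 0xFFFF:                      # fold carries back in
        total = (total & 0xFFFF) + (total >> 16)
    return ~total & 0xFFFF

def build_ipv4_header(src: str, dst: str, payload_len: int) -> bytes:
    """Minimal IPv4 header, no options (RFC 791 layout).

    Note how little it specifies: addresses, length, a hop limit (TTL),
    a protocol number, and a checksum. Nothing about radio, copper, or
    satellite -- transmission is each network's own business.
    """
    ver_ihl = (4 << 4) | 5                     # version 4, 5 x 32-bit words
    total_len = 20 + payload_len
    header = struct.pack("!BBHHHBBH4s4s",
                         ver_ihl, 0, total_len, 0, 0,
                         64, 6, 0,             # TTL 64, protocol 6 (TCP), checksum 0
                         socket.inet_aton(src), socket.inet_aton(dst))
    csum = ipv4_checksum(header)               # compute over header with checksum = 0
    return header[:10] + struct.pack("!H", csum) + header[12:]

hdr = build_ipv4_header("10.0.0.1", "10.0.0.2", payload_len=5)
assert ipv4_checksum(hdr) == 0                 # a valid header re-checksums to zero
```

The checksum is itself an example of the “strict enough, loose enough” balance: strong enough to catch corrupted headers in transit, cheap enough for 1970s routers to compute on every packet.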

Eventually, these rules became the lingua franca of the internet. But first, they needed to be implemented and tweaked and tested – over and over and over again. There was nothing inevitable about the internet getting built. It seemed like a ludicrous idea to many, even among those who were building it. The scale, the ambition – the internet was a skyscraper and nobody had ever seen anything more than a few stories tall. Even with a firehose of cold war military cash behind it, the internet looked like a long shot.

1973: Global networking becomes a reality as University College London (England) and the NORSAR facility (Norway) connect to ARPANET. The term “internet” is born. A year later, the first Internet Service Provider (ISP) arrives with the introduction of a commercial version of ARPANET, known as Telenet.

Then, in the summer of 1976, it started working.

If you had walked into Rossotti’s beer garden on 27 August 1976, you would have seen the following: seven men and one woman at a table, hovering around a computer terminal, the woman typing. A pair of cables ran from the terminal to the parking lot, disappearing into a big grey van.

Inside the van were machines that transformed the words being typed on the terminal into packets of data. An antenna on the van’s roof then transmitted these packets as radio signals. These signals radiated through the air to a repeater on a nearby mountain top, where they were amplified and rebroadcast. With this extra boost, they could make it all the way to Menlo Park, where an antenna at an office building received them.

It was here that the real magic began. Inside the office building, the incoming packets passed seamlessly from one network to another: from the packet radio network to Arpanet. To make this jump, the packets had to undergo a subtle metamorphosis. They had to change their form without changing their content. Think about water: it can be vapor, liquid or ice, but its chemical composition remains the same. This miraculous flexibility is a feature of the natural universe – which is lucky, because life depends on it.

The flexibility that the internet depends on, by contrast, had to be engineered. And on that day in August, it enabled packets that had only existed as radio signals in a wireless network to become electrical signals in the wired network of Arpanet. Remarkably, this transformation preserved the data perfectly. The packets remained completely intact.

So intact, in fact, that they could travel another 3,000 miles to a computer in Boston and be reassembled into exactly the same message that was typed into the terminal at Rossotti’s. Powering this internetwork odyssey was the new protocol cooked up by Kahn and Cerf. Two networks had become one. The internet worked.
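The gateway’s trick – changing a packet’s form without touching its content – can be caricatured in a few lines of code. This is a toy model with made-up framing formats, not the real packet-radio or Arpanet protocols: each network wraps the datagram in its own local framing, but the datagram itself passes through untouched:

```python
import json

def make_datagram(src: str, dst: str, payload: str) -> bytes:
    """Toy internetwork datagram: only network-agnostic fields.

    A hypothetical sketch of the idea behind the Kahn-Cerf protocol,
    NOT a real wire format. Any network that can move opaque bytes
    can carry it unchanged.
    """
    return json.dumps({"src": src, "dst": dst, "payload": payload}).encode()

def parse_datagram(raw: bytes) -> dict:
    return json.loads(raw.decode())

# Two dissimilar "networks", each with its own (invented) local framing.
def radio_network(raw: bytes) -> bytes:
    frame = b"RADIO|" + raw                # wireless hop adds a radio preamble
    return frame.split(b"|", 1)[1]         # receiving gateway strips it off

def wired_network(raw: bytes) -> bytes:
    frame = b"WIRE>" + raw + b"<WIRE"      # wired hop uses different framing
    return frame[5:-5]                     # receiving host strips it off

# The datagram crosses both networks; the content survives intact.
dgram = make_datagram("rossottis-van", "boston-host", "login")
dgram = wired_network(radio_network(dgram))
print(parse_datagram(dgram)["payload"])    # -> login
```

Like the water in the metaphor above, the packet takes on whatever form each medium requires, yet its composition never changes.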

An old image of Rossotti’s, one of the birthplaces of the internet. Photograph: Courtesy of the Alpine Inn Beer Garden, formerly Rossotti’s

“There weren’t balloons or anything like that,” Don Nielson tells me. Now in his 80s, Nielson led the experiment at Rossotti’s on behalf of the Stanford Research Institute (SRI), a major Arpa contractor. Tall and soft-spoken, he is relentlessly modest; seldom has someone had a better excuse for bragging and less of a desire to indulge in it. We are sitting in the living room of his Palo Alto home, four miles from Google, nine from Facebook, and at no point does he even partly take credit for creating the technology that made these extravagantly profitable corporations possible.

1976: Queen Elizabeth II hits the “send button” on her first email.

The internet was a group effort, Nielson insists. SRI was only one of many organizations working on it. Perhaps that’s why they didn’t feel comfortable popping bottles of champagne at Rossotti’s – by claiming too much glory for one team, it would’ve violated the collaborative spirit of the international networking community. Or maybe they just didn’t have the time. Dave Retz, one of the researchers at Rossotti’s, says they were too worried about getting the experiment to work – and then when it did, too worried about whatever came next. There was always more to accomplish: as soon as they’d stitched two networks together, they started working on three – which they achieved a little over a year later, in November 1977.

Over time, the memory of Rossotti’s receded. Nielson himself had forgotten about it until a reporter reminded him 20 years later. “I was sitting in my office one day,” he recalls, when the phone rang. The reporter on the other end had heard about the experiment at Rossotti’s, and wanted to know what it had to do with the birth of the internet. By 1996, Americans were having cybersex in AOL chatrooms and building hideous, seizure-inducing homepages on GeoCities. The internet had outgrown its military roots and gone mainstream, and people were becoming curious about its origins. So Nielson dug out a few old reports from his files, and started reflecting on how the internet began. “This thing is turning out to be a big deal,” he remembers thinking.

What made the internet a big deal is the feature Nielson’s team demonstrated that summer day at Rossotti’s: its flexibility. Forty years ago, the internet teleported thousands of words from the Bay Area to Boston over channels as dissimilar as radio waves and copper telephone lines. Today it bridges far greater distances, over an even wider variety of media. It ferries data among billions of devices, conveying our tweets and Tinder swipes across multiple networks in milliseconds.

The fact that we think of the internet as a world of its own, as a place we can be “in” or “on” – this too is the legacy of Don Nielson and his fellow scientists. By binding different networks together so seamlessly, they made the internet feel like a single space. Strictly speaking, this is an illusion. The internet is composed of many, many networks: when you go to Google’s website, your data must traverse a dozen different routers before it arrives. But the internet is a master weaver: it conceals its stitches extremely well. We’re left with the sensation of a boundless, borderless digital universe – cyberspace, as we used to call it. Forty years ago, this universe first flickered into existence in the foothills outside of Palo Alto, and has been expanding ever since.

As parts of the ARPANET were declassified, commercial networks began to be connected to it. Any type of computer using a particular communications standard, or “protocol,” was capable of sending and receiving information across the network. The design of these protocols was contracted out to private universities such as Stanford and the University of London, and was financed by a variety of federal agencies. The major thoroughfares or “trunk lines” continued to be financed by the Department of Defense.

1983: The Domain Name System (DNS) establishes the familiar .edu, .gov, .com, .mil, .org, .net, and .int system for naming hosts on the network. Domain names are easier to remember than a host’s numeric IP address, such as 192.0.2.10.
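The idea is easy to sketch. Below is a toy resolver with made-up records drawn from the RFC 5737 documentation address ranges – real DNS queries a worldwide hierarchy of name servers, but the mapping is the same:

```python
# Hypothetical records, using reserved documentation addresses (RFC 5737).
DNS_TABLE = {
    "example.edu": "192.0.2.10",
    "example.org": "198.51.100.7",
}

def resolve(name: str) -> str:
    """Translate a human-readable domain name into a numeric IP address."""
    try:
        return DNS_TABLE[name]
    except KeyError:
        # Real DNS would answer NXDOMAIN for an unknown name.
        raise LookupError(f"NXDOMAIN: {name!r} has no record")

print(resolve("example.edu"))   # -> 192.0.2.10
```

A real lookup in Python goes through the operating system’s resolver instead, e.g. `socket.getaddrinfo("example.org", 80)`.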

By the early 1980s, private use of the ARPA communications protocol — what is now called “TCP/IP” — far exceeded military use. In 1984 the National Science Foundation assumed the responsibility of building and maintaining the trunk lines or “backbones.” (ARPANET formally expired in 1989; by that time hardly anybody noticed). The NSF’s Office of Advanced Computing financed the internet’s infrastructure from 1984 until 1994, when the backbones were privatized.

1984: William Gibson, author of “Neuromancer,” is the first to use the term “cyberspace.”

In short, both the design and implementation of the internet have relied almost exclusively on government dollars. The fact that its designers envisioned a packet-switching network has serious implications for how the internet actually works. For example, packet switching is a great technology for file transfers, email, and web browsing but not so good for real-time applications like video and audio feeds, and, to a lesser extent, server-based applications like webmail, Google Earth, SAP, PeopleSoft, and Google Spreadsheet.

Furthermore, without any mechanism for pricing individual packets, the network is overused, like any public good. Every packet is assigned an equal priority. A packet containing a surgeon’s diagnosis of an emergency medical procedure has exactly the same chance of getting through as a packet containing part of Coldplay’s latest single or an online gamer’s instruction to smite his foe.

Because the sender’s marginal cost of each transmission is effectively zero, the network is overused, and often congested. Like any essentially unowned resource, an open-ended packet-switching network suffers from what Garrett Hardin famously called the “Tragedy of the Commons.”

In no sense can we say that packet-switching is the “right” technology. One of my favorite quotes on this subject comes from the Netbook, a semi-official history of the internet:

“The current global computer network has been developed by scientists and researchers and users who were free of market forces. Because of the government oversight and subsidy of network development, these network pioneers were not under the time pressures or bottom-line restraints that dominate commercial ventures. Therefore, they could contribute the time and labor needed to make sure the problems were solved. And most were doing so to contribute to the networking community.”

In other words, the designers of the internet were “free” from the constraint that whatever they produced had to satisfy consumer wants.

We must be very careful not to describe the internet as a “private” technology, a spontaneous order, or a shining example of capitalistic ingenuity. It is none of these. Of course, almost all of the internet’s current applications — unforeseen by its original designers — have been developed in the private sector. (Unfortunately, the original web and the web browser are not among them, having been designed by the state-funded European Laboratory for Particle Physics (CERN) and the University of Illinois’s NCSA.)

The World Wide Web wasn’t created until 1989, 20 years after the first “Internet” connection was established and the first message sent.

1990: Tim Berners-Lee, a scientist at CERN, the European Organization for Nuclear Research, develops HyperText Markup Language (HTML). This technology continues to have a large impact on how we navigate and view the Internet today.

1991: CERN introduces the World Wide Web to the public.

1992: The first audio and video are distributed over the Internet. The phrase “surfing the Internet” is popularized.

And today’s internet would be impossible without the heroic efforts at Xerox PARC and Apple to develop a useable graphical user interface (GUI), a lightweight and durable mouse, and the Ethernet protocol. Still, none of these would have been viable without the huge investment of public dollars that brought the network into existence in the first place.

Now, it is easy to admire the technology of the internet. I marvel at it every day. But technological value is not the same as economic value. That can only be determined by the free choice of consumers to buy or not to buy. The ARPANET may well have been technologically superior to any commercial networks that existed at the time, just as Betamax may have been technologically superior to VHS, the MacOS to MS-DOS, and Dvorak to QWERTY. (Actually Dvorak wasn’t.) But the products and features valued by engineers are not always the same as those valued by consumers. Markets select for economic superiority, not technological superiority (even in the presence of nefarious “network effects,” as shown convincingly by Liebowitz and Margolis).

Libertarian internet enthusiasts tend to forget the fallacy of the broken window. We see the internet. We see its uses. We see the benefits it brings. We surf the web and check our email and download our music. But we will never see the technologies that weren’t developed because the resources that would have been used to develop them were confiscated by the Defense Department and given to Stanford engineers. Likewise, I may admire the majesty and grandeur of an Egyptian pyramid, a TVA dam, or a Saturn V rocket, but it doesn’t follow that I think they should have been created, let alone at taxpayer expense.

What kind of global computer network would the market have selected? We can only guess. Maybe it would be more like the commercial online networks such as CompuServe or MSN, or the private bulletin boards of the 1980s. Most likely, it would use some kind of pricing schedule, where different charges would be assessed for different types of transmissions.

The whole idea of pricing the internet as a scarce resource — and bandwidth is, given current technology, scarce, though we usually don’t notice this — is ignored in most proposals to legislate network neutrality, a form of “network socialism” that can only stymie the internet’s continued growth and development. The net neutrality debate takes place in the shadow of government intervention. So too the debate over the division of the spectrum for wireless transmission. Any resource the government controls will be allocated based on political priorities.

Let us conclude: yes, the government was the founder of the internet. As a result, we are left with a panoply of lingering inefficiencies, misallocations, abuses, and political favoritism. In other words, government involvement accounts for the internet’s continuing problems, while the market should get the credit for its glories.

Surveillance State, Skype, Google, Netflix, and the Internet of Things all Predicted over 50 Years Ago

Though invented by the government for the military, the internet didn’t really take off until it became civilianized and commercialized – a phenomenon that the Arpa researchers of the 1970s could never have anticipated. “Quite honestly, if anyone would have said they could have imagined the internet of today in those days, they’re lying,” says Don Nielson, who was at Rossotti’s in ’76. What surprised him most was how “willing people were to spend money to put themselves on the internet”. “Everybody wanted to be there,” he says. “That was absolutely startling to me: the clamor of wanting to be present in this new world.”

Perhaps to Nielson’s surprise, someone came close to imagining today’s internet. An article in a 1965 edition of Eagle, a British comic book, predicted the arrival of the internet with stunning accuracy, including services similar to Skype, Netflix, Kindle, and Google, and even the “Internet of things,” where every home appliance is linked to the world wide web. The article, entitled “Computers for Everyone,” predicted “world knowledge at your fingertips” as early as the 1990s: “How would you like to be able to solve any mathematical problem in a fraction of a second: summon any page of any book or newspaper instantly before your eyes: have all factual information known to man at your own fingertips – all without leaving your own living room? This fantastic dream of scientific achievement may come true by the 1990s if a plan now being worked on by top scientists in this country and the U.S.A. is successful.”

“Your TV set, your telephone, your electricity and gas meters, and your typewriter, tape-recorder and record player. All these things will be as out of date as the gas-lamp is today, for the computer will control all power supplies to your house, your videophone link and multi-channel TV signal,” states the article. The piece goes on to assert that miniaturization will solve the problem of computers being the size of rooms, while the “installation of the complex nationwide network of connections between the computers” will be the biggest challenge. The arrival of high-speed fiber optic connections is even anticipated when the article speaks of a new system that “carries thousands of times more information than a cable at close to the speed of light.”

The article predicted at least seven fundamental aspects of the Internet, some of which are still only in their early stages today – a full four years before the first primitive Internet-style communications were even tested, three decades before the Internet became accessible to the general public, and four decades before we saw services like Skype, Netflix and so-called ‘smart’ products.

  • “World knowledge at your fingertips” (the search engine).
  • “Summon any page of any book or newspaper instantly before your eyes” (Kindle).
  • “The computer will control all power supplies to your house” (smart home, Internet of things).
  • “Videophone” (Skype).
  • “Multi-channel TV signal” (Netflix, Internet TVs).
  • Computers/Internet to replace “Tape recorder and record player” (Spotify, iTunes).
  • Network to operate at speed of light (fiber optic).

All of these were predicted more than 50 years ago by a children’s comic book! Weird, but true.

The Atlantic, Nov. 1967 Issue

Back in 1967, the ARPANET was still another two years away from making its first connection. But Arthur Miller (a law professor at the University of Michigan, not the playwright) foresaw the dangers of networked computing, an irresistible temptation for the development of a surveillance state by any tech-savvy government left unchecked by its people.

Miller describes a dystopian world where computers can store vast amounts of personal, medical and financial data. He warns that while this information could prove incredibly useful, it could easily become vulnerable to nefarious entities in the government, private industry, or even individuals. “Even the most innocuous of centers,” Miller writes, “could provide the ‘foot in the door’ for the development of an individualized computer-based federal snooping system.”

There are further dangers. The very existence of a National Data Center may encourage certain federal officials to engage in questionable surveillance tactics. For example, optical scanners — devices with the capacity to read a variety of type fonts or handwriting at fantastic rates of speed — could be used to monitor our mail. By linking scanners with a computer system, the information drawn in by the scanner would be converted into machine-readable form and transferred into the subject’s file in the National Data Center.

Then, with sophisticated programming, the dossiers of all of the surveillance subject’s correspondents could be produced at the touch of a button, and an appropriate entry — perhaps “associates with known criminals” — could be added to all of them. As a result, someone who simply exchanges Christmas cards with a person whose mail is being monitored might find himself under surveillance or might be turned down when he applies for a job with the government or requests a government grant or applies for some other governmental benefit. An untested, impersonal, and erroneous computer entry such as “associates with known criminals” has marked him, and he is helpless to rectify the situation. Indeed, it is likely that he would not even be aware that the entry existed.

These tactics, as well as the possibility of coupling wiretapping and computer processing, undoubtedly will be extremely attractive to overzealous law-enforcement officers. Similarly, the ability to transfer into the National Data Center quantities of information maintained in nonfederal files — credit ratings, educational information from schools and universities, local and state tax information, and medical records — will enable governmental snoopers to obtain data that they have no authority to secure on their own.

By the end of his article, Miller calls for legislation to protect the data of American citizens. He wasn’t the only one calling for privacy legislation around this time. The following year, Paul Armer of the RAND Corporation would testify before a Senate subcommittee, raising some of the very same concerns about the emergence of a snooper society. In fact, I wouldn’t be surprised if Armer had read Miller’s article. Miller’s article appeared in the November 1967 issue of The Atlantic, over 50 years ago, yet it would fit in nicely on the newsstands of today. Just check out the latest issue of The New Yorker.

Source:

Chronological History of Internet Related Events

 

Obama Does Not Renew the IANA Contract Thus Opening ICANN & the Internet to Eventually Fall Under UN Jurisdiction & Toward Global Technocracy

Obama cut Internet Corporation for Assigned Names and Numbers (ICANN) loose on September 30, 2016 by letting the Internet Assigned Numbers Authority (IANA) contract expire without being renewed. After expiration, we forever lost the right to renew the contract again. ICANN is a non-profit organization exclusively run by Technocrats. As such, it is an apolitical body that is happy to serve whatever form of governance exists as long as funding is received and salaries are paid. To a Technocrat, a world run by science and technology is better than any other form of governance. That Technocrats have played a ...
Read More

Orwellian Pokemon Go app released

Pokemon Go app released in most regions of the world by Niantic, an internal start-up of Google, the NSA-linked Big Brother company. Even now Google remains one of Niantic’s major backers. Niantic was founded by John Hanke, who also founded Keyhole, Inc., the mapping company which was created with seed money from In-Q-Tel, the CIA’s venture capital arm, and which was eventually rolled into Google Maps. In just the first two trading days after the game’s release, Nintendo’s market value rose a staggering $7.5 billion. The app requires an excessive amount of permissions on a user’s device, including the ability to read your contacts, find ...
Read More

Draconian UN Agenda 2030 is Adopted by UN Membership: A Recipe for Global Socialism & Totalitarianism

The United Nations and its mostly autocratic member regimes have big plans for your life, your children, your country, and your world. And those plans are not limited to the coercive “climate” agreement recently concluded in Paris. While the establishment media in the United States was hyping ISIS, football, and of course “global warming,” virtually every national government/dictatorship on the planet met at the 70th annual General Assembly at UN headquarters in New York to adopt a draconian 15-year master plan for the planet. Top globalists such as former NATO chief Javier Solana, a socialist, are celebrating the plan, ...
Read More

Facebook Launched a Secret Experiment with Cornell to Manipulate the Emotions of 689,003 Users

It’s become farcical. Whoever we ask, nobody seems to know anything. Did the study have ethical approval? First the answer was yes. Then it was no. Then it was maybe. Then it was no again. Was it funded by the US army? First the university said yes. Then it said no, without explanation. Why did the scientific journal not state whether the study was ethically approved, as required by its own policy? Sorry, editor Susan Fiske told me, I’m too busy to answer that question. I’m referring of course to the study published last week by the Proceedings of ...
Read More

Top Secret Document reveals NSA Allowed Full Access to Google, Facebook, Apple, etc. without Court Order

NSA Whistleblower Edward Snowden revealed that most large online data services providers participate in an NSA program to sweep up all user data into NSA computers for "Big Data" analysis and data mining. The companies revealed are Microsoft (9/11/07), Yahoo (3/12/08), Google (1/14/09), Facebook (6/3/09), Paltalk (12/7/09), YouTube (9/24/10), Skype (2/6/11), AOL (3/31/11), and Apple (added Oct. 2012, one year after Steve Jobs died and was succeeded by former IBMer Timothy D. Cook). AT&T also participated. From Stellar Wind to PRISM, Boundless Informant to EvilOlive, the NSA spying programs are shrouded in secrecy and rubber-stamped by secret opinions from a court that meets ...
Read More

Whistleblower Edward Snowden Reveals that through PRISM, the NSA Began Spying on Americans through ‘Social’ Providers

On Jun. 06, 2013, thanks to NSA Whistleblower Edward Snowden, we learned that all the large online "social" providers were creations and certainly tools of their U.S. government spy state cronies in a program called PRISM. This strategy was hatched during the Clinton administration by his spy master advisor, Harvard law professor James P. Chandler, later Leader Technologies' patent attorney. AT&T played too. What is PRISM? This one is pretty tricky to answer. According to the original leaked slides, PRISM is a US government-run programme for accessing vast swathes of data from some of the world's biggest and most ...
Read More

Director of National Intelligence James Clapper Lies to Congress about NSA Spying on Americans

On Mar. 12, 2013, Director of National Intelligence James R. Clapper lied to Senator Ron Wyden (D-Ore.) and Congress. Sen. Wyden: "Does the NSA collect any type of data at all on millions, or hundreds of millions, of Americans?" Clapper replied untruthfully: "No, sir," rubbing his head. Sen. Wyden asked, "It does not?" Still rubbing his head, Clapper hedged, knowing the NSA was engaged in high crimes and he was covering it up: "Not wittingly. There are cases where they could inadvertently, perhaps, collect, but not wittingly." On Jun. 06, 2013, nine weeks later, NSA whistleblower Edward Snowden revealed that ...
Read More

U.S. Army Sponsored Artificial Intelligence Surveillance System Attempts to Predict The Future – ‘Minority Report’ Style

In something that looks straight out of the CBS show "Person of Interest", the science website Phys.org is reporting on a potentially important breakthrough from researchers at Carnegie Mellon. In research sponsored by the United States Army Research Laboratory, the Carnegie Mellon researchers presented an artificial intelligence system that can watch and predict what a person will 'likely' do in the future using specially programmed software designed to analyze various real-time video surveillance feeds. The system can automatically identify and notify officials if it recognizes that an action is not permitted, detecting what are described as anomalous behaviors. According to the paper, one such ...
Read More

Security Week computer security analyst Jesus Oquendo published an article titled “Microsoft, the CIA and NSA Collude to Take Over the Internet”

Clever security researchers have uncovered the biggest security coup d'état on the planet. Microsoft, the NSA and the CIA have all been colluding to create the most bloated covert piece of malware known to exist, undetected for five years [1]. Microsoft decided somewhere between 2006 and 2007 that it was willing to throw away half of its market share ($128 billion) in allowing this to occur [2]. Let us begin to analyze how this occurred. Obviously the NSA, CIA and others involved had to determine a mechanism to get the right talent hired at Microsoft. Not a big deal, the ...
Read More

Patent Office Director David J. Kappos Ordered an Unprecedented Third Reexamination of Leader’s Patent by the PTAB even after Facebook had Failed on the Same Arguments Four Times

Patent Office Director David J. Kappos ordered an unprecedented third reexamination of Leader’s patent by the Patent Trial and Appeals Board ("PTAB"), even after Facebook had failed on the same arguments four times previously. Kappos assigned a former IBM and Microsoft employee, Stephen C. Siu, as chief judge and staffed this PTAB kangaroo court with IBM, Xerox and Microsoft cronies who had collectively issued over 169 patents to those companies. One of the staff attorneys, William J. Stoffel, even lists Facebook interests as conflicts (IBM, Fidelity, Vanguard). Fidelity and Vanguard were two of the largest pre-IPO mutual fund investors in ...
Read More

IBM Sold 750 “Junk” Patents to Facebook, their Eclipse Offspring, Likely so Facebook Could Harass Other Tech Companies as IBM had Done in the 1990s

IBM holds over 67,000 patents, the most in the world. Pundits know well that IBM does not sell its valuable patents. IBM creates and holds good patents. However, in the late 1990s, David J. Kappos and James P. Chandler began a "patent trolling" program to attack the industry with IBM patents of little or no value. The objective was to use IBM market power to cajole license settlements from smaller companies that did not want the expense of fighting IBM. The patents sold to Facebook are of little or no value other than for industry harassment purposes by Facebook's ...
Read More

KONY 2012: State Propaganda for a New Generation. An Orchestrated Campaign to Justify US Military Intervention in Africa

The overnight viral sensation KONY 2012 brought worldwide awareness to the African war criminal Joseph Kony. Beneath this commendable cause, however, lies an elaborate agenda that is presented in the video in a very manipulative way. We’ll look at the agenda behind KONY 2012 and how it uses reverse psychology not only to justify a military operation in Africa, but to actually have people demand it. KONY 2012 is a viral sensation (over 100 million views to date) that swept the entire world in less than 24 hours. Its main subject is the African rebel leader Joseph Kony, his ...
Read More

Hacked Hillary emails prove State Dept. Colluded with Obama and Clinton Foundation to give Globalists like IBM, Cisco, Microsoft and Goldman Sachs control of the Internet

Christina Sass, Clinton Global Initiative (CGI) program director, sent a 62-page CGI leadership email and briefing to two U.S. State Department ambassadors. The primary recipient was Melanne S. Verveer, Hillary Clinton's first-ever Ambassador-at-large for Global Women's Issues. Verveer's husband, Philip L. Verveer, U.S. State Department Ambassador of Telecommunications, also received the email. Also copied on the email were Rachel B. Vogelstein, Dept. of State director of global women's issues, and Giulia Marchiori, CGI government relationships director. The documents reveal a revolving-door relationship among Barack Obama, the State Department, Bill & Hillary Clinton and The Clinton Foundation, also called ...
Read More

Leader Technologies v. Facebook Patent Infringement Trial Ends in a Split Verdict

On March 17, 2010, Barack Obama nominated Leonard P. Stark to the Delaware District Court bench where the Leader v. Facebook case was proceeding. James Chandler evidently recommended Stark. Vice President Joe Biden sponsored Stark, who proceeded to pave the way for State Department (Hillary Clinton) and Patent Office (David J. Kappos) use of Facebook. Leader principals were informed that Facebook claimed that earlier versions of its computer source code did not exist prior to 2009, when Leader’s expert witness, Dr. Giovanni Vigna of the University of California, Santa Barbara, forensically examined Facebook’s computer source code. This explanation did not ring ...
Read More

Wikileaks: INDECT to Build Automatic Dossiers on Individuals & Organizations from Web, Mobile, & Social Networking Data

Wikileaks published a document by INDECT, the Intelligence Information System Supporting Observation, Searching and Detection for Security of Citizens in Urban Environments of Europe. According to Wikileaks, INDECT's "Work Package 4" is designed "to comb web blogs, chat sites, news reports, and social-networking sites in order to build up automatic dossiers on individuals, organizations and their relationships." In "Mind Your Tweets: The CIA Social Networking Surveillance System" (October 24, 2009, by Tom Burghardt, Global Research): That social networking sites and applications such as Facebook, Twitter and their competitors can facilitate communication and information sharing amongst diverse groups ...
Read More

Leader Technologies sues Facebook for Patent Infringement. How the Deep State Conspired to Steal their Social Networking Invention as a Tool for Rogue CIA Mind Control

Trillion Dollar Rip-Off: Social Networking is a Stolen Trade Secret. One of the largest government-sponsored industrial espionage thefts of copyrights, trade secrets, and patents in modern times was the theft of scalable social networking inventions. The technology and programming code that underlie Facebook, Gmail, YouTube, Twitter, Instagram and most of the other large-scale social networking companies run on Leader Technologies' intellectual property. It was stolen by a group of criminal lawyers, judges, spies and bankers working with complete impunity and in total disregard for the law. Under the guise of the IBM Eclipse Foundation, James P. Chandler III (who ...
Read More

Google Chrome Released: An Addition to Google’s Products Designed to Create Personal Dossiers on Everyone

Google Chrome was first publicly released on September 2, 2008 for Windows XP and later, with 43 supported languages, officially a beta version, and as a stable public release on December 11, 2008. On the same day, a CNET news item drew attention to a passage in the Terms of Service statement for the initial beta release, which seemed to grant Google a license to all content transferred via the Chrome browser. This passage was inherited from the general Google terms of service, and Google quickly removed the controversial passage from its Terms of Service. As of September 2016, StatCounter estimates ...
Read More

Clinton Pay-to-Play with Close Friend, Frank Giustra, involving Energy Deals, Uranium, and Election Fraud to put Nazarbayev back in as President of Kazakhstan

Bill Clinton accompanied his Canadian friend, Frank Giustra, to a hastily arranged meeting with Kazakhstan's strongman president, Nursultan A. Nazarbayev, in Almaty, Kazakhstan. The stated agenda was "philanthropic interests," but the real subject was uranium mining, to which Giustra was a newcomer. Senator Hillary Clinton publicly criticized Nazarbayev for his poor human rights record, but the Clinton ground game appears to have been a different story. Goldman Sachs, which has large uranium trading interests, and from which Bill Clinton had just months earlier received $650,000 in speaking fees (Dec. 03, 2004, Apr. 20, 2005, Jun. 06, 2005, Jun. 13, ...
Read More

Google Launches Gmail – an Email Scanning Service for the Rogue Government

Google launched its Gmail service, overseen by Sheryl K. Sandberg, Google VP of Operations. When you use Gmail, Google’s email service, the company scans the content of your emails and the email addresses of your correspondents. Google’s Gmail system also scans your incoming emails, even the ones coming from Yahoo and Hotmail. If you feel safe because you’ve deleted emails you regretted sending, think again. Google never erases its own copies, even copies of the drafts you decided not to send – even copies of incomplete messages you didn’t save as drafts. And then there are those Google servers, ...
Read More

Facebook Launches

The question has been raised whether Facebook has connections with the CIA. Facebook has a resource that any secret intelligence agency in the world would die for: access to over 60 million people's names, addresses, friends, activities, details about them, even phone numbers and emails. Given the CIA's shady past and reputation for doing anything to get the information it needs (you've seen the news yourself: torture, abuse, abductions), you would expect that at least one secret agency would attempt to contact Facebook, the fastest growing social networking website in the world. "Facebook may also collect information about you from ...
Read More

Mark Zuckerberg Hacks the Harvard House Sites

This is the night of sophomore Mark Zuckerberg’s infamous hacking of the Harvard house sites. Zuckerberg wrote in his online diary that night: "let the hacking begin." Zuckerberg lived in Kirkland House, just a stone’s throw from Winthrop. McKibben’s oldest son, now a surgeon, was a Harvard University student and member of the football team who lived in Winthrop House as a junior. McKibben said, "my son had a number of confidential emails from me discussing our invention in his Email Inbox." Mark Zuckerberg didn't hack into the Harvard student records databases, at least not directly. Most Harvard houses ...
Read More

The Eclipse Foundation Releases Version 2.0.1 with Source Code Containing Substantial Innovations From Leader Technologies. IBM would Claim Copyrights.

The Eclipse Foundation (IBM, Xerox, Hoffman La Roche, Bill Fenwick, Fenwick & West LLP), formed November 29, 2001, released Version 2.0.1 of its social networking software. IBM would claim copyrights. The source code contains substantial innovations from Leader Technologies supplied to IBM / Eclipse via James P. Chandler. James Chandler (also Leader's patent counsel at this time) met with Montgomery County, Maryland development officers to negotiate office space for his organization, NIPLI (National Intellectual Property Law Institute), the U.S. Patent Office and IBM. (Whistleblower notes of this 8/30/2002 meeting: "IBM - incorporating [Eclipse Foundation] members, Business Model, different from ...
Read More

NSA Whistleblowers, William Binney & Kirk Wiebe, Resign Because of the NSA’s ‘Acting in Deliberate Violation of the Constitution’ with Massive Spying

William Binney and J. Kirk Wiebe are National Security Agency (NSA) whistleblowers who worked at the agency for more than 36 years. As Technical Director, Binney developed a revolutionary information processing system called ThinThread that, arguably, could have detected and prevented the 9/11 terrorist attacks, but NSA officials ignored the program in favor of Trailblazer, a program that not only ended in total failure but cost taxpayers billions of dollars. Concerned over national security, Binney and Wiebe blew the whistle on the mismanagement of Trailblazer, using internal channels to share their concerns with Congress and the Department of Defense ...
Read More

Judicial Conference: Begins the “Safe Harbor” Mutual Fund Washington Bribery Scam that Permits Judges and Politicians to Hide Stock Without Disclosing the Holdings or the Conflicts of Interest

On Mar. 14, 2001, the Judicial Conference made sweeping changes to its ethics advisory, opening the door for widespread abuse of mutual fund exemptions that gave judges and judicial employees an excuse to hide their investments in deep-pocketed litigants behind a so-called mutual fund "safe harbor" opinion. James Chandler's influence in these changes is confirmed by Washington, D.C. sources. Jan Horbaly, Clerk of Court and Executive, Federal Circuit, was a key participant in these mutual fund reporting exemption changes exploited by the Federal Circuit panels in Leader v. Facebook, where Horbaly was the Clerk, yet failed to disclose his ...
Read More

Vice President Al Gore told CNN’s Wolf Blitzer: “I took the initiative in creating the Internet.”

Vice President Al Gore told CNN's Wolf Blitzer: "I took the initiative in creating the Internet." Hindsight shows that this may have been a Freudian slip, since Gore was in on the Deep State shadow government's plan for a rogue element within the C.I.A. to take over the Internet. From the time Clinton came to power in 1993, this global surveillance grid was not accountable to Congress and thus sat outside U.S. Constitutional checks and balances ...
Read More