
Peer-to-peer (P2P) networking is a model of computer networking in which each computer can act as a server for the others. This allows shared access to files and peripherals without the need for a central server. The idea behind P2P was first established in 1969, in the first RFC (Request for Comments), but the first implementation of a P2P network was Usenet, in 1979. The fifteen-plus years that followed the first implementation of P2P were uneventful; that changed in the late 1990s, when a surge in popularity centered on P2P’s capabilities came about. P2P security works in part by “encrypting P2P traffic, the hope is that not only will the data be safely encrypted, but more importantly, the P2P data stream is encrypted and not easily detectable. With the actual connection stream completely encrypted, it becomes much harder for the P2P traffic to be detected, and, thus, attacked, blocked, or throttled.” (1). The other aspect of P2P security is “anonymizing peers, the P2P network can protect the identity of nodes and users on the network, something that encryption only cannot ensure.” (1). The two work in conjunction, making P2P useful in the security field at times. P2P security can also be faulty, as in the following examples: “1. A new node supplying a legitimate nodeID but falsifying information in its own routing table. 2. A new node supplying a fake nodeID that is meant to cause harm to the operation of the overlay. 3. The same new node joining an overlay repeatedly with different nodeIDs. 4. A set of nodes conspiring together with fake values for nodeID to disrupt the operation of the overlay”. These problems are persistent and overwhelmingly present in P2P, making it weak on the security side at times.
The main competition to P2P is the client-server model, which is “a program relationship in which one program (the client) requests a service or resource from another program (the server).” The client-server model is considered more corporate-oriented, whereas its counterpart P2P is more consumer-based. A protocol is defined as follows: “a protocol is the special set of rules that end points in a telecommunication connection use when they communicate. Protocols specify interactions between the communicating entities.” (3). An example of a protocol is “Transmission Control Protocol (TCP), which uses a set of rules to exchange messages with other Internet points at the information packet level” (3). An example of a P2P protocol is the Content-Addressable Network (CAN). The CAN protocol’s “main tasks are insertion, deletion, and lookup of any key pair, i.e. (file, location). CAN creates a virtual coordinate system with the points in that system representing files. When searching for a file the request is mapped to a point P in the logical Cartesian coordinate system. Once at node P, then the key is looked up in a Hash table that eventually returns the location of the node which has the file.” (4). The most influential protocols were mainly associated with illegal activity, such as pirating music, movies, and games. During the surge in popularity in the late 1990s, protocols for illegal file sharing started appearing, such as “Freenet, Napster, Direct Connect, Gnutella, eDonkey2000, and BitTorrent.” (4). The main cause of these protocols coming to be links back to “July 1997: Hotline Communications was founded, giving consumers software that lets them offer files for download from their own computers.” (5). Most of these protocols fall under the category of Centralised Peer-to-Peer; the other protocols fall under Decentralised Peer-to-Peer. The layout of Centralised Peer-to-Peer is similar to a client-server model.
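The CAN lookup described above — mapping a key to a point in a Cartesian coordinate space and asking the node that owns that zone for the (file, location) entry — can be sketched in a few lines of Python. This is a minimal toy, not real CAN: the node names, the two-node zone split, and the use of SHA-1 to derive coordinates are all illustrative assumptions.

```python
import hashlib

def key_to_point(key: str, dim: int = 2) -> tuple:
    """Map a key (e.g. a file name) to a point in the unit square
    by hashing it, as CAN's virtual coordinate system does."""
    digest = hashlib.sha1(key.encode()).digest()
    coords = []
    for i in range(dim):
        chunk = int.from_bytes(digest[4 * i:4 * (i + 1)], "big")
        coords.append(chunk / 2**32)          # normalise into [0, 1)
    return tuple(coords)

class CanNode:
    """A toy CAN node owning one rectangular zone of the space."""
    def __init__(self, name, lo, hi):
        self.name = name
        self.lo, self.hi = lo, hi             # corners of this node's zone
        self.table = {}                       # local (key -> location) hash table

    def owns(self, point):
        return all(l <= p < h for l, p, h in zip(self.lo, point, self.hi))

def lookup(nodes, key):
    """Route the request to the node whose zone contains point P,
    then look the key up in that node's hash table."""
    p = key_to_point(key)
    for node in nodes:
        if node.owns(p):
            return node.table.get(key)
    return None

# Two nodes splitting the unit square down the middle.
a = CanNode("A", (0.0, 0.0), (0.5, 1.0))
b = CanNode("B", (0.5, 0.0), (1.0, 1.0))
owner = a if a.owns(key_to_point("song.mp3")) else b
owner.table["song.mp3"] = "peer-42"           # insertion: key pair (file, location)
print(lookup([a, b], "song.mp3"))             # -> peer-42
```

Insertion and deletion work the same way: hash the key, route to the owning zone, and update that node's local hash table.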
Centralised Peer-to-Peer is quite large compared with Decentralised Peer-to-Peer, but it is more expensive. The opposite is true of Decentralised Peer-to-Peer: its layout is small and less expensive than Centralised Peer-to-Peer networking. The most important protocols that fall under Centralised Peer-to-Peer are [email protected] and Napster; the important protocols under Decentralised Peer-to-Peer are GNUtella and Freenet. Napster works by relying on a server that fills certain roles, such as “a searchable index that contains entries of mp3s that all the currently connected clients contain.” However, what the server truly is may be misinterpreted, and the following helps make it clear: “the server is actually multiple very hi-spec machines load balancing the requests from clients. This makes scaling the service simply a matter of adding machines into the server pool and ensures redundancy in the fact that servers can fail and be replaced without significant disruption to the service they are providing. Redundancy needs to be implemented for the connection between client and server as well so the servers are placed on multiple connections to different large ISPs.” (5). The Gnutella protocol works as follows: “Searching on GNUtella is accomplished by creating a keyword string that describes the file you want and broadcasting that string to all your connected neighbours. Your neighbours will then in turn broadcast that message to all their connected neighbours and so on until the packet’s TTL has been reached.” (5). The Freenet protocol works as follows: “Searching and retrieval are one in the same.
To search for a resource on the network you must first know its title.” The client will then “hash that title using SHA1 and do a request for that by sending it to the most likely place to have the resource, based, once again, on key closeness.” The search then proceeds through nodes, looking for one that holds the required key, and the “search backtracks if the hop count reaches zero or if during the search a node is seen twice.” Finally, once the required resource is found, the “search terminates and the client with the resource begins to send the matching resource back along the search route to the client who requested the resource. All clients along the way will cache the passing data which aids in the replication of popular resources and means that frequently requested data is cached and dispersed widely around the network increasing redundancy and reducing access times.” (5). The last protocol to be discussed is [email protected], whose process can be described as follows: “The clients need very little functionality. They have the ability to do calculations on the data provided and are able to communicate with the central servers. The clients run the calculations continuously with a low priority and communicate with the server only to return results and to ask for new data.” (5). It was during “January 1999: Shawn Fanning, 18, creates the Napster application and service while a freshman at Northeastern University.” This would become the first P2P protocol to be well known, and synonymous with illegal activity. It was during “May 1999: a company named Napster Inc. was founded.” (5). This was a big step: with a company instituted, legal action could be brought against it for its actions. It was “June 1999: when London programmer Ian Clarke completes the original Freenet design as a student at Edinburgh University, Scotland, and makes it available on the Internet” (5).
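The Freenet search described above — hash the title with SHA-1, route the request greedily toward the node closest in key space, backtrack when the hop count reaches zero or a node is seen twice, and cache the data along the return route — can be sketched as follows. This is a rough illustration under simplifying assumptions: node names and locations are invented, the key space is shrunk to small integers, and real Freenet routing is considerably more involved.

```python
import hashlib

def key_of(title: str) -> int:
    # Freenet keys a request by a SHA-1 hash of the resource title.
    return int.from_bytes(hashlib.sha1(title.encode()).digest(), "big")

class Node:
    def __init__(self, name, location):
        self.name = name
        self.location = location   # this node's position in key space
        self.store = {}            # key -> cached resource data
        self.neighbours = []

def search(node, key, hops, seen=None):
    """Route greedily toward the neighbour closest to the key; backtrack
    when the hop count reaches zero or a node is seen twice."""
    seen = seen if seen is not None else set()
    if key in node.store:
        return node.store[key], [node]
    if hops == 0 or node in seen:
        return None, []
    seen.add(node)
    # Try neighbours in order of key closeness.
    for nb in sorted(node.neighbours, key=lambda n: abs(n.location - key)):
        data, path = search(nb, key, hops - 1, seen)
        if data is not None:
            # Cache the passing data at every node on the return route,
            # replicating popular resources across the network.
            node.store[key] = data
            return data, path + [node]
    return None, []

# A three-node line: A - B - C, with invented key-space locations.
a, b, c = Node("A", 10), Node("B", 40), Node("C", 90)
a.neighbours = [b]; b.neighbours = [a, c]; c.neighbours = [b]
k = key_of("essay.txt") % 100      # shrink the key space for the toy example
c.store[k] = b"resource bytes"
data, path = search(a, k, hops=5)
print(data)                        # the resource, now cached at B and A as well
```

After the search returns, the intermediate nodes hold copies of the resource, which is exactly the caching-for-replication behaviour the quoted description mentions.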
On December 7, 1999, the record industry sued Napster for copyright infringement. This began a wave of companies going after protocols and their creators with lawsuits, as in “April 10 2000: AOL shuts down the Gnutella project.” (5). P2P thus became a contributor to a growing market of illegal pirating, and its name was cemented by companies that would soon be gone, driven to bankruptcy by repeated illegal pirating. P2P would soon lose its footing in the corporate market due to its bad reputation and become synonymous with illegal activity. However, it would gain footing in the consumer market, where it is now the dominant power in its field. A physical connection is not only the physical link between two points but also includes the larger network: all of the nodes, such as switches and routers, that sit between those two points. The physical connection in peer-to-peer is consistent and is one of the foundations of the P2P model. The way it is implemented seems not to have changed, because the same procedure is used in current protocols and methodologies. It differs from client-server in being a computer-to-computer layout rather than the computer-to-server layout of client-server, and it is a simple connection rather than a complicated one. A true multi-user system is a piece of software that can be used by more than one user. This is important because most protocols rely on file sharing as their key concept, which is one of the main features of P2P. If P2P were not truly multi-user, it would lose this file-sharing feature, making the creation of the earlier protocols irrelevant. P2P is considered a true multi-user system because, if it were not, some aspects of it would not be possible. The popularity of Napster and other such corporations eventually faded away, though the same procedures remained in use.
It became commonplace for P2P to be a part of illegal online pirating. P2P earned a bad reputation from this popularity, but after much turmoil it became a part of consumer use. It lost its place as a competitor to client-server in the corporate field but came to dominate the consumer field instead. That is where P2P stands today: a staple of the consumer connectivity market, because it is small and cheap to use, making it a good choice for consumers.

