Sunday, May 30, 2010

Wire-free networking - the Wi-Fi world




As the name indicates, wireless networking means no cables or wires are required to network your computers and share your Internet connection. Wi-Fi connects computers, printers, video cameras and game consoles into a fast Ethernet-class network over microwave radio signals.
A wireless LAN is the perfect way to improve data connectivity in an existing building without the expense of installing a structured cabling scheme to every desk. Besides the freedom that wireless computing affords users, ease of connection is a further benefit. Problems with the physical aspects of wired LAN connections (locating live data outlets, loose patch cords, broken connectors, etc.) generate a significant volume of helpdesk calls. With a wireless network, the incidence of these problems is reduced.
A range of wireless network technologies has reached, or will soon reach, the general business market; wireless LANs based on the 802.11 standard are the most likely candidates to become widely prevalent in corporate environments. Current 802.11b products operate at 2.4GHz and deliver up to 11Mbps of bandwidth, comparable in performance to a standard Ethernet wired LAN. An upcoming version called 802.11a moves to a higher frequency range and promises significantly faster speeds; it is expected to have security concerns similar to 802.11b's.

This low cost, combined with strong performance and ease of deployment, means that many departments and individuals already use 802.11b at home or at work, even if IT staff and security administrators do not yet recognize wireless LANs as an approved technology. Without doubt, wireless LANs have a high gee-whiz factor. They provide always-on network connectivity but don't require a network cable. Office workers can roam from meeting to meeting throughout a building, constantly connected to the same network resources enjoyed by wired, desk-bound coworkers. Home and remote workers can set up networks without worrying about how to run wires through houses that were never designed to support network infrastructure. Wireless LANs may actually prove less expensive to support than traditional networks for employees who need to connect to corporate resources in multiple office locations.

Large hotel chains, airlines, convention centers, Internet cafes and the like see wireless LANs as an additional revenue opportunity, providing Internet connectivity to their customers. Wireless is a more affordable and logistically acceptable alternative to wired LANs for these organizations; for example, an airline can provide for-fee wireless network access for travelers in frequent-flyer lounges, or anywhere else in the airport. Market maturity and technology advances will lower costs and accelerate widespread adoption of wireless LANs. End-user spending, the primary cost metric, will drop from about $250 in 2001 to around $180 in 2004 (Gartner Group). By 2005, 50 percent of Fortune 1000 companies will have extensively deployed wireless LAN technology based on evolved 802.11 standards (0.7 probability), and by 2010 the majority of Fortune 2000 companies will have deployed wireless LANs to supplement standard wired LANs (0.6 probability).
For the foreseeable future, wireless technology will complement wired connectivity in enterprise environments, and even new buildings will continue to incorporate wired LANs. The primary reason is that wired networking remains less expensive than wireless; in addition, wired networks offer greater bandwidth, allowing for future applications beyond the capabilities of today's wireless systems. Although it may cost ten times more to retrofit a building for wired networking (initial construction being by far the preferred time to install network infrastructure), wiring is only a very small fraction of the overall capital outlay for an enterprise network. For that reason, many corporations are only just testing wireless technology. This limited acceptance at the corporate level means few access points with a limited number of users in real-world production environments, or evaluation test beds sequestered in a lab. In response, business units and individuals will deploy wireless access points on their own, and these unauthorized networks almost certainly lack adequate attention to information security, presenting a serious concern for protecting online business assets.
Finally, the 802.11b standard shares unlicensed frequencies with other devices, including Bluetooth wireless personal area networks (PANs), cordless phones and baby monitors. These technologies can, and do, interfere with each other. The 802.11b standard also fails to delineate roaming between access points, leaving each vendor to implement hand-off in its own way.
802.11b’s low cost of entry is what makes it so attractive. However, inexpensive equipment also makes it easier for attackers to mount an attack. “Rogue” access points and unauthorized, poorly secured networks compound the odds of a security breach.
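To make the rogue-access-point risk concrete, here is a minimal sketch of how an administrator might watch for unknown access points by sniffing 802.11 beacon frames with Python's scapy library. The interface name and the authorized BSSID list are assumptions for illustration, and the wireless card must be in monitor mode.

```python
# Minimal rogue-AP detector: sniff 802.11 beacon frames and flag any
# BSSID that is not on the authorized list. Requires root privileges
# and a monitor-mode interface; names below are assumptions.
from scapy.all import sniff, Dot11, Dot11Beacon, Dot11Elt

AUTHORIZED_BSSIDS = {"00:11:22:33:44:55"}  # hypothetical corporate APs
seen = set()

def check_beacon(pkt):
    if not pkt.haslayer(Dot11Beacon):
        return
    bssid = pkt[Dot11].addr2
    if bssid in AUTHORIZED_BSSIDS or bssid in seen:
        return
    seen.add(bssid)
    ssid = pkt[Dot11Elt].info.decode(errors="replace")
    print(f"Possible rogue AP: SSID={ssid!r} BSSID={bssid}")

sniff(iface="wlan0mon", prn=check_beacon, store=False)  # assumed interface name
```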
Although attacks against 802.11b and other wireless technologies will undoubtedly increase in number and sophistication over time, most current 802.11b risks fall into seven basic categories, including insertion attacks, interception and unauthorized monitoring of wireless traffic, and jamming.
For all its advantages, the major issue with Wi-Fi is security. Anyone within range of an open, unencrypted wireless network can 'sniff' or record the traffic, gain unauthorized access to internal network resources as well as to the Internet, and then possibly send spam or perform other illegal actions using the wireless network's IP address. These abuses are rare for home routers but can be significant concerns for office networks. There are three principal ways to secure a wireless network.
For closed networks (such as home users and organizations), the most common approach is to configure access restrictions in the access points themselves. Those restrictions may include encryption and MAC address filtering. Another option is to disable ESSID broadcasting, making the access point more difficult for outsiders to detect. Wireless intrusion prevention systems can also be used to provide wireless LAN security in this network model.
For commercial providers, hotspots and large organizations, the preferred solution is often an open, unencrypted, but completely isolated wireless network: users initially have no access to the Internet or to any local network resources. Commercial providers usually forward all web traffic to a captive portal that handles payment and/or authorization. Another solution is to require users to connect securely to a privileged network using a VPN.
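As a toy illustration of the captive-portal approach, the following sketch uses only Python's standard library to redirect every request from an unauthorized client to a login page. The portal URL and the authorized-client set are invented; a real deployment would also intercept DNS and enforce the block in the firewall.

```python
# Toy captive portal: every HTTP request from an unauthorized client is
# redirected to a payment/login page. Only the redirect step is shown.
from http.server import BaseHTTPRequestHandler, HTTPServer

PORTAL_URL = "http://portal.example.com/login"  # hypothetical portal
authorized_clients = set()  # client IPs that have paid or logged in

class PortalHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        client_ip = self.client_address[0]
        if client_ip in authorized_clients:
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"Welcome, you are online.")
        else:
            self.send_response(302)  # bounce everything to the portal
            self.send_header("Location", PORTAL_URL)
            self.end_headers()

HTTPServer(("", 8080), PortalHandler).serve_forever()
```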
Wireless networks are less secure than wired ones, but wired networks have weaknesses of their own: in many offices intruders can easily walk in and plug their own computer into the wired network unchallenged, and it is also often possible for remote intruders to gain access through backdoors such as Back Orifice. One general solution is end-to-end encryption, with independent authentication on all resources that shouldn't be available to the public.


Wireless LAN security still has a long way to go. The current implementation of WEP has proved to be flawed, and further initiatives to produce a standard that is robust and provides adequate security are urgently needed. 802.1x and EAP are just mid-points in a long journey; until a new security standard for WLANs arrives, third-party and proprietary methods need to be implemented.
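WEP's central flaw is easy to demonstrate. WEP encrypts with the RC4 stream cipher keyed by a secret plus a 24-bit initialization vector (IV); IVs repeat quickly, and two frames encrypted under the same IV reuse the same keystream, so XORing their ciphertexts cancels the keystream entirely. A self-contained Python sketch (the key and messages are illustrative):

```python
# Demonstrates WEP-style keystream reuse: two messages encrypted under
# the same RC4 key (IV + secret) XOR together to reveal plaintext XOR.
def rc4(key: bytes, data: bytes) -> bytes:
    S = list(range(256))                      # key-scheduling algorithm
    j = 0
    for i in range(256):
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    out, i, j = [], 0, 0                      # pseudo-random generation
    for byte in data:
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(byte ^ S[(S[i] + S[j]) % 256])
    return bytes(out)

iv, secret = b"\x01\x02\x03", b"secretkey"    # illustrative values
p1, p2 = b"attack at dawn!", b"meeting at noon"
c1, c2 = rc4(iv + secret, p1), rc4(iv + secret, p2)
# Same IV => same keystream => XOR of ciphertexts = XOR of plaintexts.
leaked = bytes(a ^ b for a, b in zip(c1, c2))
print(leaked == bytes(a ^ b for a, b in zip(p1, p2)))  # True
```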

DAS: Perfect for Local Data Sharing




DAS is a type of storage that is connected directly to the server; this enables quick access to the data, but only through that server.

A network storage system helps organize and save critical information created on a computer in an efficient and accessible manner. Direct-attached storage (DAS) is an extremely versatile dedicated solution that addresses many storage problems; its most common uses are server expansion and low-cost clustering. DAS is the most basic level of storage, in which storage devices are part of the host computer, as with internal drives, or directly connected to a single server, as with RAID arrays or tape libraries. Network workstations must therefore access the server in order to reach the storage device. This is in contrast to networked storage such as NAS and SAN, which is connected to workstations and servers over a network. As the first widely popular storage model, DAS products still make up a large majority of the installed base of storage systems in today's IT infrastructures.
Although networked storage is growing at a faster rate than ever, direct-attached storage is still a viable option by virtue of being simple to deploy and having a lower initial cost than networked storage. For clients on the network to access the storage device in the DAS model, they must be able to access the server it is connected to; if the server is down or experiencing problems, users' ability to store and access data is directly affected. In addition to storing and retrieving files, the server also bears the load of processing applications such as e-mail and databases, so network bottlenecks and slowdowns in data availability may occur as server bandwidth is consumed by applications, especially when a lot of data is being shared from workstation to workstation.
DAS is ideal for small businesses, departments and workgroups that do not need to share information over long distances or across an enterprise, and it localizes file sharing in environments with a single server or a few servers. Small companies traditionally utilize DAS for file serving and e-mail, while larger enterprises may leverage DAS in a mixed storage environment that also includes NAS and SAN. DAS offers ease of management and administration in this scenario, since it can be managed using the network operating system of the attached server. However, management complexity can escalate quickly as new servers are added, since storage for each server must be administered separately.
From an economic perspective, DAS is a cost-effective storage solution for small enterprises, though limited in its scalability. It is ideal for setups that rely on localized file sharing and have no need to transfer files over long distances. Enterprises that begin with DAS but later shift to networked solutions can continue to use DAS to store less critical data. A single-enclosure DAS offers some advantages, including a connection that can be managed with minimal skills, because the cabling is an integral part of the cabinet housing the server. DAS is a general-purpose solution for all types of storage processing.
Organizations that do eventually transition to networked storage can protect their investment in legacy DAS. One option is to place it on the network via bridge devices, which allows current storage resources to be used in a networked infrastructure without incurring the immediate costs of networked storage. Once the transition is made, DAS can still be used locally to store less critical data.
With all these plus points, DAS has some drawbacks. A single-enclosure DAS design suffers from poor scalability and limited disk capacity, which means DAS cannot be used as the only storage medium for an enterprise environment. Poor scalability adds to the complexity of managing the storage environment, and DAS does not allow for good management practices in which a single data-repository image is maintained. DAS does not provide the uptime or security associated with a SAN or NAS configuration, and disk consolidation with DAS is not feasible.
A multiple-external-enclosure DAS design offers the advantage of speedier recovery in case of a complete server hardware failure, and storage capacity runs into terabytes, greater than the internal capacity of a computer. On the flip side, multiple external enclosures add to the complexity of management; the approach is more expensive than an internal solution and has greater space requirements. When setting up DAS, the following aspects of the hard disks should be taken into consideration: disk capacity, disk I/O and hard disk connectivity.
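As a back-of-the-envelope illustration of the disk I/O part of that sizing exercise, the sketch below estimates how many spindles a workload needs once the RAID write penalty is taken into account. The workload figures and per-disk IOPS are assumptions, not vendor numbers.

```python
import math

# Rough disk-count estimate for a DAS array: front-end IOPS plus the
# RAID write penalty determine the back-end IOPS the spindles must serve.
RAID_WRITE_PENALTY = {0: 1, 1: 2, 5: 4, 6: 6}

def disks_needed(read_iops, write_iops, raid_level, iops_per_disk=150):
    backend = read_iops + write_iops * RAID_WRITE_PENALTY[raid_level]
    return math.ceil(backend / iops_per_disk)

# Hypothetical workload: 800 reads/s and 200 writes/s on RAID 5 at
# ~150 IOPS per disk => (800 + 200*4) / 150 = 11 disks.
print(disks_needed(800, 200, raid_level=5))
```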
Large-scale DAS deployments can be difficult to secure because of the distributed nature of the servers. DAS security includes server security policies and access limitations to the server, both physical and over the network; DAS hosted on Windows servers can be secured using group policies. DAS scores well on the manageability front so long as scalability is not an issue. Backup and recovery of DAS storage can be done over the LAN, but this adds to LAN traffic and can slow down applications. One solution is to add another network used solely for backup and recovery, but that adds management complexity and may not be adequate for very large databases.
With DAS, redundancy is provided at the disk or controller level, because with locally attached storage fault tolerance is handled by localized DAS technologies. System-level redundancy costs more, and in the event of a server problem the attached storage may be unavailable to users. To improve data accessibility, the Windows Cluster service can be deployed to provide redundant hosts that share the storage subsystem, and RAID configurations add further redundancy.
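The capacity cost of that redundancy is easy to tabulate. The following sketch shows the textbook trade-off between usable capacity and disk-failure tolerance for the common RAID levels; the disk count and size are illustrative.

```python
# Usable capacity and guaranteed disk-failure tolerance for common RAID
# levels, given n identical disks. Values are the textbook ones.
def raid_usable(n_disks, disk_tb, level):
    if level == 0:             # striping only, no redundancy
        return n_disks * disk_tb, 0
    if level == 1:             # mirrored pairs
        return (n_disks // 2) * disk_tb, 1
    if level == 5:             # one parity disk's worth of capacity lost
        return (n_disks - 1) * disk_tb, 1
    if level == 6:             # two parity disks' worth lost
        return (n_disks - 2) * disk_tb, 2
    raise ValueError("unsupported RAID level")

for level in (0, 1, 5, 6):
    usable, failures = raid_usable(8, 2, level)   # 8 x 2 TB, illustrative
    print(f"RAID {level}: {usable} TB usable, survives {failures} disk failure(s)")
```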
In terms of performance, DAS delivers well because the processor and disk sit close to each other. However, any effort to scale DAS can cause performance to fall, because the storage and the applications share the same set of resources. Unlike NAS and SAN, which use dedicated resources for storage processing, DAS also pushes storage-related traffic such as backups onto the LAN.
Like all industries, storage networking is in a constant state of change. It's easy to fall into the trap of choosing whichever storage technology is emerging or disruptive at the time, but the best chance for success comes with choosing a solution that is cost-appropriate and provides long-term investment protection for your organization. Digital assets will only continue to grow, so make sure your storage infrastructure allows cost-effective expansion and scalability. It is also important to implement technologies based on open industry standards, which will minimize interoperability concerns as you expand your network.

A DAS is a dedicated storage device added to your environment. It is an ideal solution for applications requiring a lower-cost, entry-level cluster to maintain availability. And if you are simply looking for an economical way to expand storage, then DAS is a smart alternative.

Scalable on demand: highly scalable; whether you need to add 146GB or 10TB, disks can be added as required.
Flexible: multiple configuration options for a variety of storage needs, including transactional databases, media downloads and archiving.
Dedicated: a dedicated solution ensures that only one host accesses the data on the drives, which can help satisfy the requirements of certain compliance programs.
Easy: adding a DAS is easy on the budget and eliminates the complexity of growing storage by adding another server to your configuration.

The Magic World of 4G



A strong need exists to combine both the wireless (LAN) concept and cell or base station wide area network design. 4G is seen as the solution that will bridge that gap and thereby provide a much more robust network.
Technology is versatile and changes speedily with time. Following the evolutionary line of cell-phone standards that has spanned 1G, 2G, 2.5G and 3G, 4G describes the brave new world beyond advanced 3G networks.
4G, also known as "beyond 3G" or "fourth-generation" cell phone technology, refers to an entirely new evolution in wireless communications and a complete replacement for 3G. As a successor to 2G and 3G, it aims to provide very high data transfer rates, delivering fast wireless Internet access not only to stationary users but also to mobile users. The technology is expected to trounce the deficiencies of 3G in terms of speed and quality. 4G is best described by the MAGIC vision it stands for: Mobile multimedia, Anytime anywhere, Global mobility support, Integrated wireless solutions and Customized personal services.
At this time, nobody knows the exact definition of 4G; the term is most often used to denote fast Internet access available to mobile phone users. Its distinguishing features of high-rate multimedia streaming and end-to-end IP configuration are judged to be its MAGIC enchantment. Where 3G treats WiMAX and WiFi as separate wireless technologies, 4G is expected to combine the two, and its efficiency can be estimated from the way it would coalesce these two extremely reliable technologies. 4G can also greatly advance pervasive computing, whose aim is to attach itself to every living space possible so that human beings remain connected to wireless technology both intentionally and unintentionally. 4G will be able to connect various high-speed networks together, enabling each of us to carry digital devices even in dispersed locations, and network operators worldwide will be able to deploy wireless mesh networks and make use of cognitive radio technology for widespread coverage and access. Someday 4G networks may replace all existing 2.5G and 3G networks, perhaps even before a full deployment of 3G: multiple 3G standards are springing up that make it difficult for 3G devices to be truly global, and 4G is seen as the solution that will bridge the gap between the wireless LAN concept and the cell or base station wide-area network design, thereby providing a much more robust network.
Alongside these advantages there are some major challenges in realising the 4G vision. The first major concern is power consumption, which becomes critical as multiple processing and communication elements are added to drive higher throughput (MIPS) in mobile devices; all of these elements increase current drain. Additional hardware acceleration technology will be required to manage power in this kind of environment, with OFDM-based technology emerging as crucial to managing some of the processing streams and power challenges in these kinds of applications and devices.
The second challenge is spectral efficiency, which is largely a matter of availability. For more spectrum to be made available, the options are either to re-farm existing 2G and analogue broadcast TV spectrum or to open up higher-frequency bands. Further improvements in spectral efficiency can be derived from the use of cognitive radio, though dramatic innovations will be required to deliver on that promise.
The third significant challenge is cost: infrastructure, operating and handset costs, as well as the cost of deploying services. A variety of challenges in this area come along with the network topology required for a 4G system.
First of all, to deliver the required spectral efficiency and coverage, we will see dramatic growth in the number of basestations: to support the kinds of services consumers increasingly expect, as many as three times more basestations may be needed to deliver a ten-fold increase in data rate. One way to reduce basestation density is to apply advanced antenna techniques such as MIMO and space-time coding (STC). These techniques improve spectral efficiency, reducing the number and growth rate of basestations while still achieving the coverage required to deliver the bandwidth for the applications consumers want.
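The gain from MIMO can be sketched with the standard idealized capacity formula, in which an NxN antenna system multiplies Shannon capacity by roughly the number of spatial streams, min(Nt, Nr). The bandwidth and SNR in this toy calculation are illustrative, not taken from any standard:

```python
import math

# Idealized MIMO capacity: C ~ min(Nt, Nr) * B * log2(1 + SNR).
# This is the textbook upper bound for a rich-scattering channel.
def mimo_capacity_mbps(n_tx, n_rx, bandwidth_mhz, snr_db):
    snr = 10 ** (snr_db / 10)
    return min(n_tx, n_rx) * bandwidth_mhz * math.log2(1 + snr)

for antennas in (1, 2, 4):
    c = mimo_capacity_mbps(antennas, antennas, bandwidth_mhz=20, snr_db=20)
    print(f"{antennas}x{antennas}: ~{c:.0f} Mbit/s in 20 MHz")
```

The point of the sketch is the scaling, not the absolute numbers: doubling the antennas at each end roughly doubles capacity in the same spectrum, which is why MIMO can substitute for some basestation growth.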
There are capital costs associated with growth in the number of basestations required to deliver coverage at high data rates. On the handset side, there are significant challenges in continuing to drive down the cost of integrating greater and greater processing capability in multimode RF technology. From a carrier perspective, the affordability of managing, billing and distributing content over these networks to drive revenue to recover those higher operating costs is another challenge in realising a 4G vision.
While everybody is still wondering what the killer 3G application is, people are already getting into 4G technologies; mobile media players, Internet access, broadcast technology and other aggregated services will become more robust and will drive average revenue per user (ARPU) in the carrier space.
Added to this are miniaturisation challenges, which include power reduction, cost, size and the product development cycle. Multimode technology in 4G means we have to be able to hand off between different types of radio access technologies in a seamless way, which raises significant software, billing, carrier interoperability and enterprise-carrier interoperability challenges. On the multimedia side, it is obvious that rich digital media content brings dramatic processing challenges for mobile devices.
It is obvious that 4G is not going to be driven by a single entity or organisation: it will require a tremendous number of partnerships and a robust ecosystem if the capabilities available in wireless technologies are to be exploited. Given the sweeping changes in the world of technology, multiple standards bodies, corporations and government entities will have to come together to drive standards-based interoperability and the opportunity to deliver 4G networks. Governments will have to manage the spectrum in different parts of the world, and this will have a dramatic impact on how we can exploit the capabilities available to us in wireless technologies.
Traditional equipment vendors have historically operated at layers 1-3, and wireless, like wireline Internet access, is increasingly being challenged to improve security. Security has multiple elements, much more than just moving encrypted traffic at faster and faster rates across the network; it also covers denial-of-service attacks and digital rights management, and these are all becoming carrier problems.

4G is a multipurpose and versatile technology, able to utilize almost all packet-switched technologies. It can use both orthogonal frequency division multiplexing (OFDM) and orthogonal frequency division multiple access (OFDMA); the OFDM mechanism splits a digital signal across many narrowband subcarriers at different frequencies. 4G is also capable of using multiple-input/multiple-output (MIMO) antenna technology, which is used to optimize data speed and reduce errors in the network.
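In practice, the OFDM step described above is an inverse FFT: each symbol rides on its own narrowband subcarrier, and a cyclic prefix guards against multipath. A minimal numpy sketch (the subcarrier count and prefix length are illustrative):

```python
import numpy as np

# Minimal OFDM transmit step: map bits to QPSK symbols, place one symbol
# per subcarrier, IFFT into a time-domain waveform, prepend a cyclic prefix.
N_SUBCARRIERS = 64        # illustrative; 802.11a/g also uses 64-point FFTs
CYCLIC_PREFIX = 16

rng = np.random.default_rng(0)
bits = rng.integers(0, 2, size=2 * N_SUBCARRIERS)          # 2 bits per QPSK symbol
symbols = (1 - 2 * bits[0::2]) + 1j * (1 - 2 * bits[1::2]) # QPSK mapping

time_signal = np.fft.ifft(symbols)                 # one symbol per subcarrier
ofdm_symbol = np.concatenate([time_signal[-CYCLIC_PREFIX:], time_signal])

# Receiver side (ideal channel): drop the prefix, FFT back, recover symbols.
recovered = np.fft.fft(ofdm_symbol[CYCLIC_PREFIX:])
print(np.allclose(recovered, symbols))             # True
```

The cyclic prefix simply repeats the tail of each symbol, so echoes arriving within the prefix interval fall into a region the receiver discards rather than corrupting the next symbol.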
The flexibility of 4G technologies to be used in combination with GSM and CDMA gives it an edge over other technologies, because the high broadband capability of 4G increases data streaming not only for stationary users but also for mobile users. 4G can be efficiently combined with cellular technologies to make consistent use of smartphones; the digital cameras in smartphones can be used to run video blogs from scattered geographical regions. This gives manufacturers the opportunity to produce more affordable, user-friendly 4G-compatible devices; the famous iPod is one such device that supports video blogging. Hence 4G is capable of opening a new horizon of opportunity for both existing and startup telephone companies.

4G delivers true mobile broadband for the masses, with a superior user experience. Nortel is boosting the adoption of mobile multimedia and the delivery of a true mobile broadband experience through its leadership in 4G-enabled technologies: LTE (Long Term Evolution) and IMS (IP Multimedia Subsystem). 4G mobile broadband provides improved performance and a lower total cost of ownership, and enables a new era of personalized services. 4G networks are IP-based and flatter, with fewer nodes to manage. The benefits are significant and can make 4G mobile broadband a truly disruptive technology, giving service providers a cost-effective way to deploy next-generation technology and services while redefining the end-user experience.

The next industry buzz words: Cloud Computing




Cloud computing is massively scalable, provides a superior user experience, and is characterized by new, internet-driven economics

Information technology is like an invisible layer that increasingly touches every aspect of our lives, and our dependence on it is growing faster than ever. The latest offspring it has delivered to make the planet smarter, and one ready to set a trend, is cloud computing.
Cloud computing is a famously vague topic, one of those subjects that often evokes a mixed reaction in the tech world. Businesses see it as a strong cost saving, while many IT people still have their doubts, expressing worries over the security, safety and reliability of farming out data and services to a cloud provider. Hullabaloo aside, the cloud comes into focus only when you think about what IT always needs: a way to increase capacity or add capabilities on the fly without investing in new infrastructure, training new personnel or licensing new software. Cloud computing encompasses any subscription-based or pay-per-use service that, in real time over the Internet, extends IT's existing capabilities.
It’s a new generation of computing that utilizes distant servers for data storage and management, allowing the device to use smaller and more efficient chips that consume less energy than standard computers. Cloud computing allows consumers and businesses to use applications without installation and access their personal files at any computer with internet access. This technology allows for much more efficient computing by centralizing storage, memory, processing and bandwidth.
A simple example of cloud computing is web email such as Yahoo Mail or Gmail. People don't need software or a server to use them; all a consumer needs is an Internet connection to start sending emails. The server and the email management software are all on the cloud (the Internet) and are totally managed by the cloud service provider (Yahoo, Google and so on). The consumer simply uses the software and enjoys the benefits.
The term cloud computing probably comes from (at least partly) the use of a cloud image to represent the Internet or some large networked environment. It is a technology used to access services offered on the Internet cloud. Everything an informatics system has to offer is provided as a service, so users can access these services available on the “Internet cloud” without having any previous know-how (or at least not as an expert) on managing the resources involved.

The term "cloud computing" encompasses many areas of tech, including software as a service, a software distribution method pioneered by Salesforce.com about a decade ago. It also includes newer avenues such as hardware as a service, a way to order storage and server capacity on demand from Amazon and others.
Cloud computing is broken down into three segments: "services," "platforms," and "infrastructure." Each segment serves a different purpose and offers different products for businesses and individuals around the world.
Infrastructure-as-a-Service offerings such as Amazon Web Services provide virtual server instances with unique IP addresses and blocks of storage on demand. Customers use the provider's application program interface (API) to start, stop, access and configure their virtual servers and storage. In the enterprise, this allows a company to pay for only as much capacity as it needs and to bring more online as soon as it is required. Because this pay-for-what-you-use model resembles the way electricity, fuel and water are consumed, it is sometimes referred to as utility computing.
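In concrete terms, the start/stop workflow described above looks roughly like this with Amazon's Python SDK, boto3. The AMI ID, instance type and region are placeholders, not working values:

```python
import boto3

# Sketch of the IaaS pay-as-you-go workflow: start a virtual server
# through the provider's API, then stop it when the capacity is no
# longer needed. AMI ID, instance type and region are placeholders.
ec2 = boto3.client("ec2", region_name="us-east-1")

resp = ec2.run_instances(
    ImageId="ami-12345678",      # hypothetical machine image
    InstanceType="t2.micro",
    MinCount=1,
    MaxCount=1,
)
instance_id = resp["Instances"][0]["InstanceId"]
print("started", instance_id)

# ...use the capacity, paying only while it runs...
ec2.stop_instances(InstanceIds=[instance_id])
```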
Platform-as-a-Service in the cloud is defined as a set of software and product-development tools hosted on the provider's infrastructure. Developers create applications on the provider's platform over the Internet; PaaS providers may use APIs, website portals or gateway software installed on the customer's computer. Force.com (an outgrowth of Salesforce.com) and Google Apps are examples of PaaS. Developers should be aware that there are currently no standards for interoperability or data portability in the cloud, and some providers will not allow software created by their customers to be moved off the provider's platform.
In the software-as-a-service cloud model, the vendor supplies the hardware infrastructure, the software product and interacts with the user through a front-end portal. SaaS is a very broad market. Services can be anything from Web-based email to inventory control and database processing. Because the service provider hosts both the application and the data, the end user is free to use the service from anywhere.
The major issue slowing cloud computing growth is security. No matter how many security-management tools are released or assurances of reliability are made, complications with data privacy and data protection continue to plague the market. Privacy is a related matter: if a client can log in from any location to access data and applications, it is possible the client's privacy could be compromised, and cloud computing companies will need to find ways to protect it. One way is to use authentication techniques such as user names and passwords; another is to employ an authorization format in which each user can access only the data and applications relevant to his or her job.
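That authorization idea, each user reaching only the data relevant to his or her job, is ordinary role-based access control. A minimal sketch, with the roles and permissions invented for illustration:

```python
# Minimal role-based authorization check: each user may touch only the
# resources mapped to his or her role. Roles/permissions are invented.
ROLE_PERMISSIONS = {
    "accountant": {"invoices", "payroll"},
    "engineer": {"source_code", "build_logs"},
}
USER_ROLES = {"alice": "accountant", "bob": "engineer"}

def can_access(user: str, resource: str) -> bool:
    role = USER_ROLES.get(user)
    return role is not None and resource in ROLE_PERMISSIONS.get(role, set())

print(can_access("alice", "payroll"))   # True
print(can_access("bob", "payroll"))     # False
```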
Cloud computing is considered a paradigm shift for the computing industry, one that would affect several sub-industries, including software companies, Internet service providers and hardware manufacturers. Companies in each of these industries will face significant change if cloud computing is the next step for the industry. While it is relatively easy to see how the main software and Internet companies will be affected, the effect on Internet-access and hardware companies is harder to predict: if companies switch to streamlined, cloud-based computer systems, they will have fewer in-house IT needs. Some industry experts believe that the need for IT jobs will migrate to the back end of the cloud computing system.

Cloud computing really is about accessing the resources and services needed to perform functions with dynamically changing needs. An application or service developer requests access from the cloud rather than from a specific endpoint or named resource. What goes on in the cloud manages multiple infrastructures across multiple organizations, and consists of one or more frameworks overlaid on top of those infrastructures, tying them together. These frameworks provide mechanisms for self-healing, self-monitoring, resource registration and discovery, service-level agreement definitions and automatic reconfiguration.

Tuesday, May 25, 2010

Wire up with wireless network




A basic wireless network consists of multiple stations communicating with radios that broadcast in either the 2.4GHz or 5GHz band

A wireless network is a network set up using radio-frequency signals to communicate among computers and other network devices. It is getting popular nowadays because it is easy to set up and involves no cabling; you can connect computers anywhere in your home without the need for wires.

Wireless Personal Area Networks (WPANs) interconnect devices within a relatively small area, generally within reach of a person; for example, Bluetooth provides a WPAN for connecting a headset to a laptop. Fixed point-to-point wireless links, by contrast, often use dedicated microwave or laser beams over line-of-sight paths, and are often used in cities to connect the networks in two or more buildings without physically wiring the buildings together. Wireless Metropolitan Area Networks (WMANs) are a type of wireless network that connects several wireless LANs. Wireless Wide Area Networks (WWANs) typically cover large outdoor areas; these networks can be used to connect the branch offices of a business or as a public Internet access system, and the links between sites are usually point-to-point microwave links, often on the 2.4GHz band.
With the development of smartphones, cellular telephone networks routinely carry data. The GSM (Global System for Mobile Communications) network, for example, is divided into three major systems: the switching system, the base station system, and the operation and support system. A cell phone connects to the base station system, which connects to the operation and support system; the call then reaches the switching system, where it is transferred to wherever it needs to go. GSM is the most common standard and is used by the majority of cell phones. Personal Communications Service (PCS) is a radio band that can be used by mobile phones in North America and South Asia; Sprint happened to be the first service to set up a PCS network. D-AMPS (Digital Advanced Mobile Phone Service), an upgraded version of AMPS, is being phased out due to advances in technology, with the newer GSM networks replacing the older system.
Since the time of World War II, wireless networks have been very useful for transferring data from one place to another efficiently and reliably. Cellular phones, which we use every day to communicate with others, are one of the best examples; others include sending information overseas via satellite, and emergency services such as police departments, which rely on wireless networks to communicate important information quickly. Businesses use wireless networks to send and share data quickly, whether within a small office building or across the world. Another feature making wireless popular is that it is an inexpensive and rapid way to connect to the Internet in countries and regions where the telecom infrastructure is poor or resources are lacking, as in most developing countries.
Apart from all these good points, there are some issues causing problems, such as compatibility: components made by different companies may not work together. Wireless networks are also typically slower than those connected directly through an Ethernet cable.
A wireless network is more vulnerable because anyone can try to break into a network that broadcasts its signal, so security is another issue standing in the way of its popularity. In recent times there have also been increased health concerns: the possible risk of tumors and other diseases due to exposure to electromagnetic fields (EMFs) needs further investigation. Despite these issues, wireless communication continues to gain popularity thanks to its flexible nature.

Friday, May 21, 2010

What does it take to be a professional programmer?




Are you a programmer just by chance, or is it your passion to be in this field? To become a professional programmer you need a genuine interest in the field. Whenever the term software programmer comes to mind, the only word that strikes a chord is 'technical', but being a professional programmer takes more than technical skill. Let's see what these qualities signify:

Trustworthiness - Are you capable of respecting the privacy of your clients? Can your project manager trust you with sensitive information? If you're given clients' data or have signed a non-disclosure agreement, you are being trusted to respect privacy. You are also trusted to check license agreements on third-party tools or libraries and to obtain licenses or permission as required.
Teamwork - Do you cooperate with your team members, help them when they need it, and stay out of office politics? Can you do your share of the work and trust your team to do the rest? And can you accept your management (and sometimes even clients) as part of the team, with everyone trying to get the same job done?
Leadership - Do you believe in knowledge sharing, and can you delegate tasks efficiently? Leadership means both earning respect from others and knowing what to do with it. Recognize the skills of your team members, and make sure you can offer each person challenges and development without exceeding what they can cope with at a given time.
Communication - Teamwork can't happen without good communication, nor can accountability. Communication is critical for helping clients to produce usable specifications and feedback. A professional's communication is effective and to the point, whether in person, in email, on the phone or in written documents.

Updating Skills - Are you aware of the latest methodologies in the industry, such as eXtreme Programming; of new libraries, refactoring tools, standards, file formats and protocols; of Unicode, XML, SQL and all the other acronyms; of the platforms your potential clients are using; and of cross-platform development? Basically, you need a genuine interest in your field, and to read broadly so you know what's out there and which areas to then read deeply about.
Minimizing Risks - Professional programmers keep track of known bugs and of any change they intend to make. Another risk that's often not properly considered is any and all changes to the source code: source is your livelihood, and any change can be a mistake. Professional programmers are careful to do enough testing. A software company will generally have testers, but developers need to know how to get the most out of testers, and also how to write their own unit and regression tests to make sure every change in behavior is noticed and checked.
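In practice, that habit looks something like the following. The function under test and the bug its regression test guards against are invented for illustration:

```python
import unittest

def parse_version(s: str) -> tuple:
    """Turn '1.10.2' into (1, 10, 2). (Invented function for illustration.)"""
    return tuple(int(part) for part in s.split("."))

class TestParseVersion(unittest.TestCase):
    def test_basic(self):
        self.assertEqual(parse_version("1.10.2"), (1, 10, 2))

    def test_regression_numeric_ordering(self):
        # Regression test: string comparison once ranked '1.9' above '1.10'.
        # Numeric tuples must compare correctly, so any reappearance of the
        # old string-based behaviour is caught here.
        self.assertLess(parse_version("1.9.0"), parse_version("1.10.0"))

if __name__ == "__main__":
    unittest.main()
```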
Accountability - Writing code for others is a responsibility. You need to make sure your software is reliable, and that you and the client truly understand the requirements and specifications. You need documentation of your work: all current and past bugs, your progress, any problems, signed-off milestones and more. You are also required to know about some basic legal issues, such as software licensing and the terms of your employment contract.

LPO is no longer a fad





The availability of outsourcing ventures with world-class resources and expertise is also boosting the growth of outsourcing by legal entities. Legal Process Outsourcing is a lucrative route to reducing cost and increasing efficiency by outsourcing legal work to India.

After Business Process Outsourcing (BPO) and Knowledge Process Outsourcing (KPO), it is now Legal Process Outsourcing (LPO) that has joined the bandwagon. Developing nations like India are the hot spots for outsourcing in today's global marketplace. LPO is being called the next big thing in Indian business; it marks India's climb up the outsourcing chain, from low-end back-office service functions in call centers to high-value, skilled legal work.
An LPO company basically provides law firms and in-house legal departments with legal support services, such as document review, due diligence, legal research, accounting and billing, and intellectual property services, quickly, efficiently, confidentially and at a very affordable cost.
LPO is no longer a passing phase; the trend has shifted from domestic law firms to an increasingly competitive global marketplace with massive scope and job opportunities. The clock is ticking fast, and with time LPOs in India are becoming among the most sought-after companies to work for. India's leading LPO companies pledge to provide efficient, effective and economical legal support services to a global clientele in the current dynamic and competitive business environment. LPO expertise is utilized by corporate bodies, law firms, research firms and clients from various other fields. When the outsourced entity is based in another country, the practice is sometimes called offshoring.
Legal outsourcing is not a new phenomenon; LPO started back in 1995, when the US-based law firm Bickel & Brewer sent its legal work to its subsidiary in Hyderabad. Earlier, many lawyers in India, the USA and the UK were unaware of LPO; people came to know about it when LPOs in India found success and the media highlighted it. Now many are entering the LPO sector in the hope of finding better prospects and creating new history in the industry. Most firms and corporations outsource primarily to save money, and this is considered the biggest advantage of legal outsourcing: while an attorney in a major legal market such as the US may charge $150–350 per hour for rote services, legal process outsourcing firms can often charge a fraction of this. This has attracted major corporations to outsource specific work outside their legal departments, while the developing countries benefit from the huge inflow of income and job creation. It is a business where both ends have their own share of the profit, which is one of the reasons the industry is growing fast. Legal outsourcing has already created 12,000 job opportunities in India alone, and this figure is expected to rise to as many as 79,000 by 2015.
These specialists are conscientious in meeting their clients' objectives, with the support of knowledgeable, qualified, cross-functional teams of legal practitioners, professionals and scholars. Easy accessibility and prompt, quality service at an economical cost, backed by a high degree of client confidentiality and data security, is what differentiates a provider such as Acumen from other service providers.
One of the major concerns with legal outsourcing is the potential for breach of clients’ confidentiality. Another concern is that the people performing legal work may not be bound to the necessary ethical standards. However, there have been ethics opinions from various state Bar Associations (New York, San Diego) and recently, the American Bar Association that discuss ethical legal outsourcing and how to achieve it. Because of the sensitive nature of legal work, Indian outsourcing companies have tried to allay the concerns about confidentiality. They have installed closed-circuit televisions, network safeguards and hack-proof servers. Many outsourcing companies in India already have those security measures in place because they have been handling the credit card and banking operations of global companies for more than a decade. Industry members say that outsourcing of legal work to India is a natural next step.
Arguing in an Indian court and eventually becoming an honorable judge is no longer the only option for law school graduates. Lawyers in recent years have come to understand the mechanics of globalization and economics, and this is the right time to apply that understanding to legal project management as well. LPO is one area where niche legal services outsourcing firms can provide much-needed exposure and experience: the huge sphere of work handled by such firms gives the lawyers working there opportunities to learn and execute different principles of legal project management. There are multiple success stories in such companies of individuals who joined as legal analysts and, in a short span of time, owing to the challenging nature of the work, have become leaders of big teams of lawyers and paralegals.
Gradually, the mindset regarding LPO units is changing among the masses. People now comprehend the multi-dimensional nature of the work done in big LPO companies and are eager to carve out careers in such organizations. The industry offers an attractive career path for many of the 300,000 Indians who enroll in law schools every year, drawing aspirants to join the LPO sector and earn good money.
Presently there are over 200 LPOs in India, and the number is expected to rise over the next few years; the industry has grown at over 60 percent year on year. According to a research firm, legal services outsourcing to India provides the highest cost savings, 44%, in comparison with other markets. The research also revealed that LPO services realise the highest profit margin for providers, with an average margin of 29%, ahead of IT (24%), finance and accounting (21%) and procurement (19%). All these figures indicate that legal outsourcing is the best avenue for cost saving, which in the present economic slowdown is the most coveted prize. The research also highlights the expansion plans of LPO service providers, pointing to the increase in business coming their way: Indian LPO firms have seen revenue rise by over 200% in the past year, with an amazing increase in litigation work, bankruptcy-related document drafting and intellectual property infringement work.
Many countries, such as the Philippines, South Africa and China, are also looking to expand their business in this area, but India's primacy is unlikely to be affected, because none of these nations can yet deliver the kind of scalability that India offers.

The offshore legal services business in India is still small (annual revenues account for less than 4 per cent of total BPO revenues), but it has been among the fastest growing segments of the BPO business. Certainly, that dramatic growth slowed in 2008 and 2009; now, though, the business is blooming again, significantly beyond most companies' and analysts' expectations.

Thursday, May 20, 2010

Cheer up! It’s World Telecommunication Day




We live in a society surrounded by television, radio, the Internet and cell phones, and can't imagine a moment without these possessions. For quite some time India has been a hot spot for telecommunication activities: mobiles, the Internet, satellite and cable networks. All thanks to information and communication technologies (ICTs); there is hardly any vista left untouched by ICT, which has had an impact on our social and economic life and redefined it overall. To commemorate this ever-greater reach, World Telecommunication and Information Society Day (WTISD) is celebrated on 17 May every year, across the globe.
Most of us are not aware that World Information Society Day was proclaimed for 17 May by a United Nations General Assembly resolution, following the 2005 World Summit on the Information Society in Tunis. The day had previously been known as World Telecommunication Day, commemorating the founding of the International Telecommunication Union on 17 May 1865; it was instituted by the Plenipotentiary Conference in Malaga-Torremolinos in 1973. In November 2006, the ITU Plenipotentiary Conference in Antalya, Turkey, decided to celebrate both events on 17 May as World Telecommunication and Information Society Day.
The intention behind commemorating WTISD is to raise awareness of the possibilities that the use of the Internet and other ICTs can bring to societies and economies, as well as of ways to bridge the digital divide. In simple terms, it aims to tone up national policies, close technological gaps, promote connectivity, foster global interoperability of systems, shrink physical distances through the Internet, TV, phones and the like, and make information and communication more accessible to people living in remote areas.
This juncture reminds us of the disparities between the haves and have-nots among the urban and rural population: those far from development, deprived of access to the means of communication and information. This is the time to think of millennium development and to bring about change in society using ICT, where people are still longing for minimal needs such as sanitation, health, shelter and food. ICT has the power to empower society by providing information and knowledge, and its potential should be used for this noble cause.
The day is celebrated with a different theme every year: for World Telecommunication Day 1997 it was "Telecommunications and Humanitarian Assistance", for 2007 it was "Connecting the Youth: the Opportunities of ICT", for 2008 it was "Connecting Persons with Disabilities", and for 2010 the theme is "Better city, better life".
This year is no different; countries across the globe are celebrating it in different ways. On this occasion, ITU calls upon all stakeholders (policy makers, regulators, operators and industry) to promote the adoption of policies and strategies that will promote ICTs in urban areas and contribute towards a better life in cities. The WSIS (World Summit on the Information Society) Forum 2010, organized by ITU, UNESCO (United Nations Educational, Scientific and Cultural Organization), UNCTAD (United Nations Conference on Trade and Development) and UNDP (United Nations Development Programme), is scheduled to be held from 10 to 14 May 2010 at ITU Headquarters in Geneva, Switzerland. The event builds upon the tradition of annual WSIS May meetings, and its new format is the result of open consultations with all WSIS stakeholders.
The Forum will offer participants a series of diverse interactions, including high-level debates addressing issues critical to WSIS implementation and follow-up in multi-stakeholder set-ups, WSIS Action Line facilitation meetings, thematic workshops, kick-off meetings for new initiatives and projects, and knowledge exchanges facilitating networking among participants. It will provide structured opportunities to network, learn and participate in multi-stakeholder discussions and consultations on WSIS implementation.
China celebrates 17 May in Shanghai at the Expo Center of World Expo 2010. The WTISD-2010 theme is aligned with that of the World Expo, "Better city, better life", which represents the common wish of all humankind for better living standards in future urban environments; ICTs play a catalysing role in achieving this goal. Shanghai will also host the ITU World Telecommunication and Information Society Awards, which honour eminent personalities who have contributed to using ICTs to provide a better life in cities.
Bulgaria's Ministry of Transport, Information Technology and Communications (MTITC) chose to mark this year's celebration under the national initiative "Manager for a Day 2010", part of "Junior Achievement Bulgaria", held on 22 March 2010 in Sofia. The initiative invites young people to share ideas for improving people's lives by acting as managers of the public administration for one working day.
In Uganda, the Commonwealth association CPAUG, in partnership with the ICT Ministry, UCC and I-Network, will organize a public dialogue for about 250 participants on 17 May 2010, under the WTISD 2010 theme "Enhancing the achievements of the Information Society in Uganda in building Better Cities".
Poland wants to make ICT a household word, and to accomplish that goal a number of events, seminars, discussion panels and conferences will take place this year. Arab states are also showing keen interest in making their mark: AICTO, in cooperation with the Arab Towns Organization (ATO) and Tunis City, is organizing an international conference on the occasion of WTISD 2010. The conference will take place on 22 May 2010 in Tunis City and will include two panels and a roundtable related to this year's theme, "Better city, better life with ICTs".
India is not far behind: to make the occasion memorable, the Maharana Pratap University of Agriculture and Technology in Udaipur is organizing a National Workshop on Computer Networking as well as a celebration on the occasion of WTISD-2010. Apart from that, the fifth World Telecommunication Development Conference (WTDC-10) will take place in Hyderabad, India from 24 May to 4 June 2010 at the kind invitation of the Government of India. The objective of WTDC-10 is to identify priorities for the development of telecommunications and information and communication technologies (ICTs), taking into account contributions made by Member States and Sector Members, and to adopt the Hyderabad Action Plan (HAP) setting out the activities of the ITU Telecommunication Development Sector (ITU-D) over the next four-year period.
With initiatives like these being taken throughout the world, it is quite obvious that the big players are taking ICT expansion seriously, and if this continues it will have a tremendous impact. ICTs present innovative ways of managing our lives (smart buildings, intelligent traffic management), provide solutions to many current problems, and help cities develop in an efficient and eco-friendly manner. Surely the vision "Better city, better life with ICTs" will come true; the only obligation is for every individual to do their bit in making it real.

Monday, May 3, 2010

ENVIRONMENTAL JOURNALISM

Environmental journalism is not something new, but at the same time it has something new to offer; as global warming, climate change and similar issues take the limelight, it is gaining popularity. It involves the collection, verification, production, distribution and exhibition of information regarding current events, trends and issues related to the environment, and making people aware of them. To be an environmental journalist, one must have an understanding of scientific language and practice, knowledge of historical environmental events, the ability to keep abreast of environmental policy decisions and the work of environmental organizations, a general understanding of current environmental concerns, and the ability to communicate all of that information to the public in a way that is easily understood, despite its complexity.

Environmental journalism falls within the scope of environmental communication, and its roots can be traced to nature writing. One key controversy in environmental journalism is a continuing disagreement over how to distinguish it from its allied genres and disciplines.