Saturday, June 18, 2011

Best Graphics Cards Under 10K

Holiday season is fast approaching and we realize our trigger-happy readers are aching to gear up for the upcoming onslaught of eye-candy-rich games. 2011 is shaping up to be a really good year for PC games with Bulletstorm and Dead Space 2 taking charge; we have other great titles like Crysis 2, R.A.G.E and NFS Shift 2 to look forward to. While these games won't be too difficult to run even on modest systems, since all of them are multi-platform, choosing the right card will make all the difference to the overall gaming experience. Looking at the market, we've noticed some companies have really aggressive pricing compared to others: for AMD cards, Sapphire and MSI offer the lowest rates, and for Nvidia, we have ZOTAC and, again, MSI. It's sad to see other companies like Asus and XFX come up with absurd pricing, which makes you wonder how they manage to make any profit in India. Even though their cards are of good quality, the premium is at times just too high to make that kind of investment worthwhile.

Today, we'll be taking a look at some great deals we've managed to find under Rs. 10,000. While we all want fancy expensive cards, the reality is we don't really need them, especially if you game at anything lower than full HD resolution. We all saw that Crysis 2 is easily playable on a 9600GT on "Gamer" settings up to 1920x1080, so going by that alone, the cards we've picked today will easily handle all the upcoming games; unless of course we get some shoddy console port, in which case it's hardly our fault. One more thing before we get started: the list is based on Mumbai prices, so if you find a better deal elsewhere, holler in the comments section as it could help others in your city.

Under Rs.3,000 - Stick to Onboard
If you're on a budget of anything under 3K, it's advisable you don't buy anything at all. I'd say work a bit harder and jump straight to our second category. The problem is that there isn't a single worthy card under this price bracket that would give you playable frame rates. Cards under 3K are designed for HTPC use, which means they're only slightly better than onboard graphics: enough to offload 1080p video decoding and provide a wide array of multichannel audio options via HDMI.

Under Rs.5000 - Sapphire HD 5670 512MB/1GB
Our under-5K pick still remains the HD 5670. Sapphire's offering comes with a nice cooler and a larger fan for better, quieter cooling. You can buy either the 512MB or the 1GB version, as the price difference is only a couple of hundred bucks. The extra video RAM is nice to have, although it isn't needed much. The 1GB version is available for Rs.4,700, while the 512MB version can be had for a little less at Rs.4,400.



Under Rs. 6000 - MSI N250GTS-2D512 512MB
Yes, the G92 chip is still going strong and, for a smidge under 6K, it's currently the best buy. MSI has shed the reference cooler and gone for something bolder with their own custom heatsink. The GTS 250 is built on a 55nm process and is essentially the second coming of the 9800GTX+: it has the full 128 shaders of the G92 core and comes with faster clock speeds. The 9800GT is also selling for around the same price, and we strongly advise you to stay away from it, as it has the same cut-down G92 core as the 8800GT. There's also the HD 4850 for those interested in an AMD solution, but we'd stick with the GTS 250.




Apple iPad 2 Preview

Apple certainly likes to make the few people in India who like their products wait. Even then, we somehow manage to get our hands on their products before they launch here. Last year we were among the first in India to get our hands on an iPad and publish our hands-on experience with it. This year, as luck would have it, our Head of HR decided to get an iPad for herself from the UK and was glad to lend it to us for a preview. Of course, we know better than to keep an owner away from her new gadget for long, so we only had enough time for a quick preview like last time instead of a full review. Still, what follows is a sufficiently detailed description that should give you an idea of what exactly the new iPad 2 is all about.
 

iPad 2


Design
The iPad was never particularly thick, but it's only after you hold the iPad 2 that you realize just how thick the original really was. The iPad had a curved back and an edge almost a centimeter thick. The iPad 2 has a flat back and almost no side edge at all; instead, the back simply curves up to meet the front, just like on the fourth-generation iPod touch.
 
iPad 2


This has a considerable effect on the ergonomics. While the new design worked against the iPod touch, it improves the feel of the iPad 2 considerably: it feels much better in hand and gives you a better grip. The weight is also better distributed than on the iPad, where the bulging center felt heavier than the sides. Apple has reduced the overall weight as well; however, it still feels a bit heavy for extended single-handed use. The problem, more than the weight itself, is the width of the device, which spreads the weight away from your hand and puts more strain on your wrist.
 
iPad 2


When viewed from the front, the iPad 2 again looks smaller. It lacks the substantial metal border found on the iPad; on the iPad 2, the glass now seems to extend almost to the edge. There is still a significant bezel around the display, necessary if you are to hold the device without touching the touchscreen. You may also notice the FaceTime camera above the display, with the ambient light sensor located slightly above it.

Motorola XOOM Launching in India

The much-hyped Motorola XOOM is now headed to India. Reports suggest that it will make its way into the country by the end of this month. The Motorola XOOM tablet was originally launched this past February in the United States as the first ever Honeycomb device; Honeycomb is the version of Google's Android operating system specially optimized for tablets. While some may feel that the XOOM is a bit late to enter the Indian Honeycomb tablet scene, with tablets like the Acer Iconia Tab already available in the market, the XOOM will still carry the distinction of being the world's first Honeycomb tablet.

The Motorola XOOM


The Motorola XOOM features a 10.1" display with a resolution of 1280x800 pixels, and it comes with a 5 MP rear camera capable of recording 720p video and a 2 MP front-facing camera for video calling. It is powered by a 1 GHz dual-core NVIDIA Tegra 2, complemented by 1 GB of RAM to take care of the tablet's performance. There are quite a few connectivity options too, with an HDMI port and a microSD card slot. The tablet will be available in 16 GB and 32 GB variants, though details about pricing are very scarce at the moment.

The Acer Iconia Tab A500, already available in the market, offers similar functionality with very few differences, though it has been criticized for poor build quality. Still, there's very little to differentiate between the Acer Iconia and the Motorola XOOM. The Samsung Galaxy Tab 10.1 will also be launched soon, making it even harder to choose. The Galaxy Tab 10.1 will come packed with Android 3.1, whereas the XOOM ships with Android 3.0, though an update is promised by Motorola. The XOOM has a superior 5 MP camera, while the Galaxy Tab 10.1 has a 3.15 MP unit.

Sunday, March 6, 2011

What is WCDMA

WCDMA stands for Wideband Code Division Multiple Access. It is one of the main systems used for third-generation, or 3G, mobile communication networks. The term is often used interchangeably with UMTS, which stands for Universal Mobile Telecommunications System. Technically, WCDMA is the radio access technology used by most UMTS networks.
The most prominent use of WCDMA is in Japan, where the country's largest mobile phone operator, NTT DoCoMo, uses the technology. It was DoCoMo that originally developed WCDMA; the firm then successfully lobbied for it to be accepted as an international standard.
WCDMA is used across the world in dozens of countries. It is most prominently used in Asia and Europe. Outside of Japan, the UMTS name is generally used for marketing the system.

The WCDMA system combines elements of two main types of mobile phone technology: CDMA and GSM, which stands for Global System for Mobile communications. In the United States, most cellphone network providers use only one of these two technologies. One of the main reasons WCDMA has struggled to get a foothold in the US is that it uses two channels, each covering 5 MHz. This is a relatively large "chunk" of the airwaves, which has been problematic because the US was slow to allocate new frequencies specifically for 3G systems.

There are several key advantages to WCDMA. One is that each transmitter is assigned an identification code. This means that data from multiple transmitters can be carried over the same frequency in the same geographical area at the same time without interference or loss of signal strength.
The system also uses power control. This adjusts the strength of the signal transmitted by each cellphone so that it reaches the nearest transmitter at the same strength, regardless of how far away the phone is. This avoids the transmitter receiving signals which are excessively strong or weak, which could limit the transmitter's efficiency.
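To make the identification-code idea concrete, here is a toy Python sketch of CDMA-style spreading. The Walsh codes and data bits are invented for illustration, and real WCDMA uses far longer spreading sequences, but the principle is the same: two transmitters share one frequency at the same time, and each receiver recovers its own data by correlating against the right code.

```python
# Two orthogonal Walsh codes (their dot product is zero)
code_a = [1, 1, 1, 1]
code_b = [1, -1, 1, -1]

def spread(bits, code):
    """Spread each data bit (+1/-1) across the full code sequence."""
    return [b * c for b in bits for c in code]

def despread(signal, code):
    """Correlate the combined signal with one code to recover that
    transmitter's bits."""
    n = len(code)
    bits = []
    for i in range(0, len(signal), n):
        corr = sum(s * c for s, c in zip(signal[i:i + n], code))
        bits.append(1 if corr > 0 else -1)
    return bits

bits_a = [1, -1]   # transmitter A's data
bits_b = [-1, -1]  # transmitter B's data

# Both signals occupy the same channel at the same time: just add them.
channel = [a + b for a, b in zip(spread(bits_a, code_a),
                                 spread(bits_b, code_b))]

print(despread(channel, code_a))  # recovers A's bits: [1, -1]
print(despread(channel, code_b))  # recovers B's bits: [-1, -1]
```

Because the codes are orthogonal, each transmitter's contribution cancels out when the signal is correlated against the other code, which is why neither link suffers interference.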
 

What is 4G UMTS

Fourth-generation Universal Mobile Telecommunications System, often abbreviated 4G UMTS, is a wireless telecommunications data transfer standard. Though a number of devices claim to use 4G UMTS, the original standards set by the International Telecommunication Union are not yet met by these devices. 4G UMTS uses many of the same devices and much of the same infrastructure as third-generation UMTS (3G UMTS).
UMTS, sometimes referred to as Wideband Code Division Multiple Access (WCDMA), uses Internet Protocol (IP) technology to connect wireless users with the Internet. First developed in the 1990s, UMTS is a reliable network that is frequently used to transmit data and voice. Mobile phones, laptop computers and other devices can connect to the Internet and make voice calls over a UMTS system.

Though not yet in wide use as of 2011, 4G UMTS calls for significant speed increases over the UMTS standard, which has been in use since 2001. 3G UMTS requires that data be transferred at a peak rate of at least 200 kilobits per second. In 4G UMTS, data must download at a rate of 100 megabits per second on mobile devices and at 1 gigabit per second for electronics connected to a local wireless access hub. Both 3G UMTS and 4G UMTS require the simultaneous transfer of voice and data, a requirement first established during the switch from second-generation to third-generation protocols. 3G UMTS and 4G UMTS can both transfer information using the same infrastructure.
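To get a feel for those numbers, here is a back-of-the-envelope Python comparison of the 3G floor against the 4G mobile target. The 700 MB file size is an arbitrary example (roughly a CD image), and overhead is ignored.

```python
def download_seconds(file_megabytes, rate_bits_per_second):
    """Time to move a file at a given line rate, ignoring protocol overhead."""
    return file_megabytes * 8 * 1_000_000 / rate_bits_per_second

file_mb = 700  # example: roughly a CD image

t_3g = download_seconds(file_mb, 200_000)      # 3G floor: 200 kbit/s
t_4g = download_seconds(file_mb, 100_000_000)  # 4G mobile target: 100 Mbit/s

print(f"3G floor: {t_3g / 3600:.1f} hours")    # about 7.8 hours
print(f"4G target: {t_4g:.1f} seconds")        # about 56 seconds
```

The same file that takes most of a working day at the 3G minimum comes down in under a minute at the 4G mobile target.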
A number of pre-4G devices have been on the market, starting as early as 2006. Though these devices are not up to 4G standards, they are considerably faster than the 3G standard. In December 2010, the International Telecommunication Union, an agency of the United Nations, determined that some of the pre-4G technologies already on the market could label themselves as 4G even though they did not quite reach the levels required by the standards.

4G UMTS is one of a number of standards that wireless carriers can use. It is a common choice among carriers, mainly because the infrastructure has been around for a long time. Ultra Mobile Broadband (UMB) is a Qualcomm CDMA implementation that offers higher speeds than 3G UMTS, though less than 4G UMTS. Many companies were dissatisfied with aspects of UMB, however, and may choose to work with UMTS instead as 4G technologies and systems are released.

What is 3G Broadband

Third-generation (3G) broadband, primarily used by telecommunications providers and their customers, offers mobile Internet through specifically enabled devices. Devices that access 3G broadband can browse websites, download content, and access other Internet services from a mobile location. At the time of its release, 3G broadband proved to be unrivaled in terms of mobile Internet technology.
3G utilizes wireless technology standards such as Enhanced Data rates for GSM Evolution (EDGE), Universal Mobile Telecommunications System (UMTS), Code Division Multiple Access (CDMA), and Worldwide Interoperability for Microwave Access (WiMAX) to create high-speed data and voice mobile networks. Unlike its predecessor 2G, 3G offers voice and data services at high rates, coupled with the ability to use voice and data simultaneously. 3G download speeds reach as high as 14.4 Mbit/s, while upload speeds reach as high as 5.8 Mbit/s. 3G standards are defined by the International Telecommunication Union as IMT-2000, with an aim to facilitate growth, increase bandwidth, and support diverse applications.

3G broadband was first released in 2001, with WiMAX introduced to the telecommunications market in 2007. 3G broadband was initially launched by NTT DoCoMo in Japan. Following this launch, 3G spread to other countries in Asia and Europe before reaching the United States, where the first operator was Monet Mobile Networks, followed by Verizon Wireless, which launched 3G in October 2004. Generally speaking, the 3G broadband market is still dominated by cell phone providers, who offer the service primarily to cell phone users.

3G broadband data rates vary depending on the location and provider, but they are still a significant improvement over 2G, a generation noted for slow data transmission. In addition to rate improvements, security has been strengthened in 3G broadband.
Aside from cellular phones, 3G broadband has also become accessible through much smaller devices called dongles, which connect other devices, such as computers, to mobile Internet services. 3G broadband technology has yet to reach widespread use and coverage, and it is still surpassed by conventional Internet access in terms of bandwidth.
 

Friday, February 11, 2011

What is a Wireless Data Card

Wireless data cards have taken mobile Internet connectivity and workforce mobility to a different level. These devices allow you to access the Internet wirelessly, at broadband speeds, in virtually any location. Access is provided through the coverage area of the carrier or telecommunications provider offering the service. Wireless data cards fit into the USB or card slots of a laptop computer.

Anywhere, Anytime Internet Access
Wireless data cards allow you access to the Web whenever and wherever you want, without having to worry about WiFi hotspots or the expensive tariff rates of hotel Internet service providers.

Access Technologies
Access to the Internet is through the network coverage of the relevant telecommunications provider, via Code Division Multiple Access (CDMA), Evolution-Data Optimized (EV-DO) or High-Speed Packet Access (HSPA) technologies.

Enhanced Productivity
Entrepreneurs, media and entertainment executives, and working professionals can work anywhere, anytime and access the Internet at high speeds for work or entertainment purposes. They can access email and enterprise applications, stream videos, and download applications, images and graphics-rich documents on the go.

Broadband Internet Connectivity
High-speed Internet connectivity is another advantage of wireless data cards. Broadband speed generally averages between 600 Kbps and 1.4 Mbps.

Carriers Offering Wireless Data Cards
Verizon, AT&T, Sprint and Alltel are the major carriers offering wireless data cards. Users can choose from the different monthly service plans available.

What is Blackberry Technology

In all honesty, there's nothing overwhelmingly terrific about the Blackberry device itself. PDA devices had cellphone and Internet capabilities integrated into them long before the Blackberry became a household name. The first versions of the Blackberry weren't leaps and bounds ahead of the leading hand-held mobile devices of the day.
With that said, the Blackberry technology itself was a major advancement in mobile technologies, and it's the Blackberry service that ultimately led to the popularity of the device. It's the ability of this service to keep you instantly connected to your home or office network that made the Blackberry one of the leading mobile devices of today, with its own cult following of avid enthusiasts and users.
So what exactly is Blackberry Technology, and what makes it so special? It’s called Push Technology.

 

Push Technology Compared to the PDA “Pull” Method

There are different configurations of the Blackberry service, but the most common enterprise setup is with the Blackberry Enterprise Server (BES). The entire purpose of this server is to keep all Blackberry users instantly updated the moment any “data event” occurs (my own terminology).



To understand Blackberry technology, it's important to first understand how PDAs operated before the Blackberry came along. While most of the discussion centers around email, I like to refer to "data events" as anything that requires an update on your mobile. In the days before PDAs became Internet-enabled, you would basically do your work while mobile, and then, when you got back home or to the office, connect a cable between the device and your PC and do a "sync." During a sync, changes on your device get uploaded to your email or calendar accounts, and any new emails or calendar changes at those accounts get loaded onto your PDA. Updates would take place once a day, or whenever you had time to sync the device.
However, once PDA devices came integrated with Internet access via cellular data networks, you could sync certain "data events" with your various accounts without having to physically connect to a PC. This is illustrated in the figure above. When you want to retrieve new emails from your POP3 or IMAP email account, you tell the PDA to go out and retrieve them. The path of data transfer starts with the PDA (the red arrow) and goes through the Internet-enabled mobile email software, which connects to and talks with the email server over the Internet. After retrieving all new emails, the connection to the server is closed and the new emails are displayed.
If you want to reply to one of the emails, you type your reply on the PDA and click send. The cycle starts all over again, with the PDA always initiating communication and requesting updates. Because of this, if you haven't configured your mobile device to automatically retrieve emails, you could go hours without receiving an important email.
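The pull cycle described above can be sketched in a few lines of Python. The class names and the sample message are invented stand-ins for this illustration, not any real mail API; the point is that nothing reaches the device until the device asks.

```python
class MailServer:
    """Invented stand-in for a POP3/IMAP server holding new messages."""
    def __init__(self):
        self.inbox = []

    def deliver(self, msg):
        # Something arrives at the server.
        self.inbox.append(msg)

    def fetch_new(self):
        # Client-initiated retrieval: hand over and clear pending messages.
        msgs, self.inbox = self.inbox, []
        return msgs

class PollingPDA:
    """A device that only learns about updates when it polls."""
    def __init__(self, server):
        self.server = server
        self.messages = []

    def sync(self):
        # The user (or a timer) has to trigger this.
        self.messages.extend(self.server.fetch_new())

server = MailServer()
pda = PollingPDA(server)

server.deliver("Meeting moved to 3pm")
assert pda.messages == []          # nothing on the device yet: it hasn't asked

pda.sync()                         # only now does the update arrive
assert pda.messages == ["Meeting moved to 3pm"]
```

Between `deliver` and `sync`, the message just sits on the server, which is exactly the gap that push technology closes.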

Push Technology Keeps You Instantly Updated All The Time

 

Now, let’s take a look at how synchronization takes place with Blackberry Technology.


When you're using a Blackberry device, you can picture your handheld unit as a network device, like a PC on a LAN, that's always connected to your data account through the redirector software. Instead of residing on your mobile device, the software (whether it's BES or desktop redirector software) runs on the server side.
Communication between the software and the server, as well as between the software and your Blackberry, is always a two-way street. Whenever there's a change in your email account, calendar or any other monitored account, whether on the Blackberry (you create and send an email) or on the office network (your secretary updates your calendar with a new appointment), the BES immediately updates either the mobile device or the email or calendar account. In other words, Blackberry push technology keeps the device constantly and instantly in sync without any effort on your part.
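As a rough sketch of the push model (class names invented for illustration, not part of any actual Blackberry software), the server holds a reference to every registered device and forwards each "data event" the moment it happens, in either direction:

```python
class PushServer:
    """Invented stand-in for a BES-style server that pushes every change."""
    def __init__(self):
        self.devices = []
        self.account = []            # the office-side mailbox/calendar

    def register(self, device):
        self.devices.append(device)

    def data_event(self, item):
        # A change on the office network...
        self.account.append(item)
        for d in self.devices:       # ...is pushed out immediately.
            d.receive(item)

class PushDevice:
    """A handheld that receives updates without ever polling."""
    def __init__(self, server):
        self.inbox = []
        self.server = server
        server.register(self)

    def receive(self, item):
        self.inbox.append(item)

    def send(self, item):
        # A change made on the handheld updates the account at once.
        self.server.account.append(item)

server = PushServer()
device = PushDevice(server)

server.data_event("New appointment at 10am")
assert device.inbox == ["New appointment at 10am"]   # no sync step needed

device.send("Reply: confirmed")
assert "Reply: confirmed" in server.account
```

Compare this with the pull model: here the device never initiates a retrieval, yet both sides stay in step after every event.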

What Is So Special About Blackberry Technology?

 

In all honesty, a PDA running Windows Mobile could be configured in much the same way as the BES service. New "sync" options are being offered every day that can keep your Internet-enabled Windows Mobile device instantly up to date. For example, Aibek mentioned a few great sync tools, such as ShifD, Mobical and OggSync, that will do just that. Another cool method is Karl's use of Mail2Web to sync up his device with email. Or it could be as simple as using GoogleSync to stay synced up.
The fact is, Blackberry technology was novel when it first came out; however, the subsequent lawsuit from NTP, which claimed RIM had used technology already in use for PDAs, was only the first indication that while the technology is certainly effective and valuable, it isn't rocket science. The device is now little more than a status symbol, a way for someone to feel like they're on the cutting edge of cellular technology. But in a few years, everyone will be instantly connected to their email, calendar and social networks, and it will be on to the next great thing.

What’s your opinion of Blackberry Technology? Do you think it stands up to all of the hype? Share your own point of view in the comments section below.

Bluetooth Technology

Bluetooth wireless technology is a short-range communications technology intended to replace the cables connecting portable and/or fixed devices while maintaining high levels of security. The key features of Bluetooth technology are robustness, low power, and low cost. The Bluetooth Specification defines a uniform structure for a wide range of devices to connect and communicate with each other.
The structure and global acceptance of Bluetooth technology mean that any Bluetooth enabled device, almost anywhere in the world, can connect to other Bluetooth enabled devices located in close proximity.

Connections between Bluetooth enabled electronic devices allow these devices to communicate wirelessly through short-range, ad hoc networks known as piconets. Piconets are established dynamically and automatically as Bluetooth enabled devices enter and leave radio proximity, meaning that you can easily connect whenever and wherever it's convenient for you.

Each device in a piconet can simultaneously communicate with up to seven other devices within that single piconet, and each device can also belong to several piconets at once. This means the ways in which you can connect your Bluetooth devices are almost limitless.
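The seven-device limit can be modeled in a few lines of Python. The class and method names here are invented for illustration and are not part of any Bluetooth API; the sketch only captures the rule that one master coordinates at most seven active slaves in a piconet.

```python
MAX_ACTIVE_SLAVES = 7  # Bluetooth piconet limit: one master, seven active slaves

class Piconet:
    def __init__(self, master):
        self.master = master
        self.slaves = []

    def join(self, device):
        # Refuse an eighth active slave, as the spec's addressing allows only 7.
        if len(self.slaves) >= MAX_ACTIVE_SLAVES:
            raise RuntimeError("piconet full: 7 active slaves already")
        self.slaves.append(device)

net = Piconet("phone")
for name in ["headset", "keyboard", "mouse", "printer",
             "laptop", "camera", "speaker"]:
    net.join(name)                   # seven slaves join fine

try:
    net.join("tablet")               # the eighth is refused
except RuntimeError as e:
    print(e)                         # prints: piconet full: 7 active slaves already
```

A device wanting to talk to more than seven peers would participate in several piconets (a scatternet) rather than growing a single one.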

A fundamental strength of Bluetooth wireless technology is its ability to simultaneously handle data and voice transmissions, which provides users with a variety of innovative solutions such as hands-free headsets for voice calls, printing and fax capabilities, and synchronization for PCs and mobile phones, to name a few.
The range of Bluetooth technology is application specific. The Core Specification mandates a minimum range of 10 meters (30 feet), but there is no set upper limit, and manufacturers can tune their implementations to provide the range needed to support the use cases of their solutions.

.NET Framework

This article will help you understand .NET and the .NET architecture.
 
What is the .NET Framework?
 
The .NET Framework is a new and revolutionary platform created by Microsoft for developing applications.
  • It is a platform for application developers.
  • It is a framework that supports multiple languages and cross-language integration.
  • It has an IDE (Integrated Development Environment).
  • A framework is a set of utilities, or building blocks, for your application system.
  • The .NET Framework provides tools for building graphical (GUI) applications.
  • .NET can be made platform independent with the help of Mono, an open-source implementation of the framework for other operating systems.
  • The .NET Framework provides interoperability between languages through the Common Type System (CTS).
  • The .NET Framework also includes the Common Language Runtime (CLR), which is responsible for managing the execution of all applications developed using the .NET library.
  • The .NET Framework consists primarily of a gigantic library of code.
Definition: a programming infrastructure created by Microsoft for building, deploying, and running applications and services that use .NET technologies, such as desktop applications and Web services.
 

Cross Language integration

 
You can use a component written in one language from another language; this is possible because all .NET languages compile to a common intermediate language and share the Common Type System.
 
The .NET Framework places no restriction on the types of applications that are possible: it allows the creation of Windows applications, Web applications, Web services, and lots more.
 
The .NET Framework has been designed so that it can be used from any language, including C#, C++, Visual Basic, JScript, and even older languages such as COBOL.

Wednesday, February 9, 2011

What is an Intranet

An intranet is usually described as an internal or restricted-access network that is similar in functionality to the Internet but is only available within an organization. For instance, if your organization would like to share specific information, such as documents, current announcements or new product details, but only allow computers in the organization access to it, you would use an intranet.
In order to have an intranet, the computers in the network do not need a normal Internet connection. However, since most organizations have both an intranet and Internet access, the organization will provide a gateway such as a firewall, along with other ways to identify the user, such as authentication, encryption, or the use of a VPN (virtual private network); an intranet opened up to authorized outside users this way is sometimes called an extranet. With these added provisions, individuals with clearance can access the intranet from outside, over the Internet. For instance, an off-site employee can connect to the intranet and download specific reports or data.

Advantages of an Intranet

Intranets can enhance productivity at an organization. They can be used for many things dealing with communication. For instance, intranets can be helpful to organizations large and small by serving as delivery mechanisms for applications, drivers and collaborative projects.
An intranet can also help associates find data quickly and easily through a browser interface. For instance, your organization might have medical insurance information on the intranet, which workers can easily navigate and access. This can reduce the amount of time it takes to contact an individual in the HR department; instead, information is at the fingertips of all associates. Another way intranets enhance productivity is that information is available when a worker needs it, not just when the people who have the information send it out via email.

Disadvantages of an Intranet

While, for the most part, an intranet is very advantageous to any organization, there are a few downsides, including the fact that management does need to give up control of specific information. While this problem can usually be minimized with proper foresight, problems do occur.
Security might be another disadvantage of an intranet. For instance, an employee might post sensitive information for all employees to see. Another issue is that there can simply be too much information: information overload can take place when too much data is put up on the intranet, making it very difficult for employees to navigate and find the data that is meaningful to them.

Wednesday, February 2, 2011

Sound card Installation

First of all, you need to be clear about why you are setting out to install a sound card. If you are facing sound card problems, try some other solutions before you set out to replace it. But if your computer does not have a sound card at all, then you surely need to install one to enjoy all the multimedia features of your computer.

Go to a computer repair store and purchase a new sound card before you begin. Do some research before making this purchase, and buy a sound card from a well-reputed and reliable company. This will cost a few more dollars, but it will ultimately be worth it: an inferior-quality sound card will wear out faster and cause more problems in the future. Keeping all this in mind, let's now see how to install a sound card.

  • To begin with, go through the user manual that came with your computer and read all you can about opening up the case and about the sound card. The more you know about these, the better it will be for you.
  • Next, make sure the drivers that come with the sound card are installed on your computer. When you purchase the sound card, you will receive a CD containing the drivers along with it. Install these on your machine before you physically install the sound card. Alternatively, you can download the drivers from the sound card manufacturer's website.
  • Now switch off your computer completely, turn off all power switches and unplug the power cable. Remove the side panel of the computer by unscrewing all the visible screws. It's also a good idea to ground yourself before touching any internal components, to avoid static damage.
  • Locate the old sound card and unplug everything connected to it. Do not blindly pull out the card; carefully note all the cables attached to it, then release them one by one before you actually pull out the old sound card.
  • Insert the new sound card into the same slot in the same manner, and reconnect all the wires and cables that were attached to the old one, including those of the speakers if they were attached to the old sound card.
  • Finally, close up the side of the computer and switch on the machine.
You have now successfully installed the sound card, and if you have done everything right, it will work instantly. Try playing an audio file to check that the sound card is functioning properly. If it is not, the problem could be either hardware or software. Installing the drivers properly is an integral part of sound card installation; if this is not done correctly, the sound card will not work.