4.24.2007

Make Time for Web Marketing 2.0


By Andrew Wetzler
E-Commerce Times
April 24, 2007

There are many opportunities to increase the flow of qualified visitors to a Web site by utilizing more advanced tactics. In order to be truly successful online, in addition to having a very good site, a company needs to effectively employ both search engine marketing (SEM) and search engine optimization (SEO).

SEO and SEM, although different, share a common thread. SEO drives nonpaid traffic; its purpose is to make the site better for both search engines and users, and it encompasses everything that can be done to utilize the technology of search engines to promote a Web site and increase its traffic. SEM, by contrast, uses the practice of paying a search engine or a directory to add a site to its database immediately, rather than setting up that site so that it will be found by the search engine spiders on its own. Both efforts are driven by keywords, and the better conceived the list, the more favorable the end result will be.


Key SEO Tactics

Search engine rankings are a function of sophisticated algorithms. While it is true that the algorithms are a moving target, which no one can fully explain, the factors that lead a Web site to perform well in the natural results are fairly consistent.

• The first issue is the degree to which a site contains hearty content that aligns closely with the keywords that are most important to the site.

• Limiting the number of keywords and phrases per page to two or three is also important. Incorporating content about more than two or three keywords on any Web page will weaken your message, which won't be viewed favorably by the search engines (see the sketch after this list).

• Another step in making sites as search engine-friendly as possible is to ensure that the architecture doesn't present any obvious roadblocks. Flash is probably the best example of a technology that is much better suited to the end user than to the search engine spidering process. It's reasonable to have some Flash on the site, but it's not a good idea, from a search engine ranking perspective, to have a home page that is all Flash. In general, the less, the better.

• Both content management systems (CMS) and template-driven Web sites can be problematic for SEO. The problem with many of these frameworks is that the site owner has very little ability to modify the site's code, and the majority of templates are not SEO-friendly to begin with.

• Incrementally adding relevant links to your site is also a very good approach. Link popularity is an important component of the ranking criteria for the major engines, particularly Google.
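
As a rough, hypothetical illustration of the keyword-focus guideline above, here is a minimal Python sketch that counts how often each target phrase appears in a page's text. The function name, phrase list, and sample copy are all made up, and real search engines weigh many more signals (titles, headings, links), so treat this purely as a back-of-the-envelope check:

    import re

    def keyword_counts(page_text, phrases):
        """Count case-insensitive occurrences of each target phrase in the page text."""
        text = page_text.lower()
        counts = {}
        for phrase in phrases:
            # \b word boundaries keep "car" from matching inside "carpet"
            pattern = r"\b" + re.escape(phrase.lower()) + r"\b"
            counts[phrase] = len(re.findall(pattern, text))
        return counts

    sample = ("We sell handmade leather wallets. Each leather wallet is "
              "hand-stitched. Browse our handmade leather wallets today.")
    for phrase, n in keyword_counts(sample, ["leather wallet", "handmade"]).items():
        print(phrase, "->", n, "occurrence(s)")

If two or three phrases dominate the counts, the page is focused; if a dozen phrases each appear once or twice, the message is diluted.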


Advanced SEM Methodology

There has been a dramatic improvement in the tools that are available to figure out what's working and what isn't. Variables include engine selection (MSN vs. Google, etc.), keywords (broad vs. narrow, exact/negative match, etc.), as well as factors like which pages of your site are moving a prospect toward your objective in contrast to others that may be hindering a conversion.

It used to be that the cost associated with analytics programs was substantial, so only the largest online players could justify the expense. With free and low-cost programs like Google Analytics, the investment is now limited to the time required to set the tools up properly (including coding the site) and the post-click analysis of the data.
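
As a hypothetical illustration of that post-click analysis, the sketch below computes a per-keyword conversion rate from click and conversion tallies. The keywords and numbers are invented, and a package like Google Analytics reports this for you; the point is just to show the arithmetic involved:

    # Hypothetical paid-search data: keyword -> (clicks, conversions)
    campaign = {
        "running shoes": (1200, 36),
        "buy running shoes online": (310, 19),
        "shoes": (5400, 27),
    }

    for keyword, (clicks, conversions) in campaign.items():
        rate = 100.0 * conversions / clicks
        print("%s: %.1f%% conversion (%d/%d)" % (keyword, rate, conversions, clicks))

Broad terms like "shoes" often draw many clicks but convert poorly; surfacing that pattern is exactly what tells you which keywords (and which pages) are moving prospects toward your objective.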

For more information, check out: http://www.technewsworld.com/story/9stHwNbCIanrWd/Make-Time-for-Web-Marketing-20.xhtml



The beauty of Internet marketing is its mass appeal. But these days it's not only important to have a website; it's important to utilize advancing technologies to bring ads to people's attention. Web sites have to be designed in a certain way so that they get 'short-listed' by search engines. Understanding how search engines work is probably the first step all businesses need to take; they can then build or update their sites to be in sync with the engines. This popularizes the website, which eventually leads to more traffic.

Web marketing has dramatically changed the way advertisers think and design their campaigns. And as the Internet becomes more and more enculturated throughout the world, it'll be the best way to reach customers on a global basis (not that it isn't on its way already).

4.12.2007

How iTunes Works


iTunes is not file-sharing. At its most basic level, iTunes is a music player that can play a wide variety of digital music, rip music from a CD to a computer, and create CDs out of digital music on your PC. iTunes also allows you to buy music from the iTunes Music Store.

Using iTunes to get music on your iPod is probably the most popular way of using iTunes. However, you can also use your iTunes library with one of the Motorola/Cingular iTunes phones, which let you download up to 100 songs to the phone. Apple's wireless-networking hub, AirPort Express, is now "AirPort Express with AirTunes" -- you can wirelessly stream iTunes music from your computer to your hub-connected home-theater speakers. With this setup, you control playback via your computer. With another iTunes stream receiver, Roku's SoundBridge Network Music Player, you control everything through the SoundBridge remote control. So you're not limited to any single option when it comes to playback. But you are limited in some other ways.

You can use the iTunes Mac software with, say, a Creative Nomad MP3 player. But iTunes for Windows only supports the iPod -- if you connect a Creative Nomad to a Windows machine running iTunes, the software won't see it. There is no version of iTunes for Linux machines.

So iTunes (or at least the Mac version) does support other players besides the iPod. But here it gets even trickier: No matter what computer you use, you can't download (or stream) music you bought at the iTunes Music Store to a non-iTunes player. Music you download from the iTunes Store is protected by Apple's DRM (digital rights management), a proprietary, protected AAC file format that Apple doesn't license to anybody. The only devices that can play those files are ones with the ability to decrypt the Apple DRM, which includes your computer running iTunes, an iPod, an iTunes phone and your speakers connected to AirPort Express.
To play iTunes Music Store files on a portable player besides an iPod, you first have to burn them to a CD; the DRM encoding doesn't make it onto the CD. You then rip the now-unprotected files back into your iTunes library as MP3s and download them to the player.

Layton, J. How iTunes Works. Retrieved from www.howstuffworks.com, April 12, 2007.

To learn more about the software and iTunes interface, check out http://electronics.howstuffworks.com/itunes.htm


I support protecting the music industry; that's why I like how music bought from iTunes can't simply be copied to any non-Apple player. But I believe the popularity of the iPod has lessened the demand for CDs.

Overall, I love what you're able to do with iTunes: its organization and management capabilities, creating playlists, how it converts a lot of different file formats to work with iTunes, tag editing, and whatnot.

4.11.2007

“The Book Stops Here” by Daniel Pink


Wikipedia is a multilingual, Web-based, free-content encyclopedia project. Wikipedia is written collaboratively by volunteers from all around the world. With rare exceptions, its articles can be edited by anyone with access to the Internet, simply by clicking the "edit this page" link.

Wikipedia was founded as an offshoot of Nupedia, a now-abandoned project to produce a free encyclopedia. Nupedia had an elaborate system of peer review and required highly qualified contributors, but the writing of articles was seen as very slow. During 2000, Jimmy Wales, founder of Nupedia, and Larry Sanger, whom Wales had employed to work on the project, discussed various ways to supplement Nupedia with a more open, complementary project.

Wikipedia came about using wiki software. It differs from traditional encyclopedias in many ways. First, the articles are never complete. They are continually edited and improved over time, and in general this results in an upward trend of quality. Second, Wikipedia depends on “radical decentralization and self-organization”; it is “self-repairing and almost alive.” Wikipedia’s ability to allow almost anyone to edit information follows a “One for All” model, while traditional encyclopedias follow a “One Best Way” model, in which research and data, once written, tend to stay static.

Wikipedia relies on good faith and neutrality. But because it is an open system, it is susceptible to many problems, vandalism being the main one. Because of the defense mechanisms built into Wikipedia, and because it has many dedicated users who readily fix wrong and inappropriate content, “Wikipedia has an innate capacity to heal itself.” Not to mention, Wikipedia has administrators and bureaucrats to ensure that vandals get ousted. Another problem is credibility. Because there is no scholarly review of the articles, and no one needs a doctoral degree to write or edit content, Wikipedia is not considered a reliable source in academia. However, Pink states that it is difficult to evaluate Wikipedia against traditional encyclopedias. Wikipedia follows a different model: it’s free, it’s dynamic, and it does not claim to be infallible. Unlike traditional encyclopedias, Wikipedia is never finished, which actually enhances its quality.



Wikipedia is a great source when you need to become an expert on a topic really quickly. And it’s free. I’ve compared information on Wikipedia to other ‘credible’ sources, and the information almost always matches quite closely. But since anyone can write and edit content on Wikipedia, I can understand why it’s not a valid resource in any academic field. Those who make a living out of doing research probably want special validation and respect for their work. And it’s only fair, which is why most classes in all fields require scholarly work to be cited. But that doesn’t undervalue Wikipedia in any way. It’s a great forum for those who want to present their knowledge to the world but don’t have all the information or expertise to complete it with perfection. Such information is then brought to the attention of those who care, and it molds into something of high quality and accuracy. It brings together a community of people who share the same passion for certain ideas and want to make them right for the world to see. It further adds to the jungle of information that’s out there, with an attempt to make topics as succinct and clear as possible for the average person to understand. It’s a fabulous invention.

4.03.2007

Computer Hacking


A computer hacker is someone who lives and breathes computers, who knows all about computers, and who can get a computer to do anything. Some types of hackers are ethically driven. However, hacking is more commonly associated with criminality. Two recent hacks are listed below:

British Hacker Loses U.S. Extradition Appeal
By REUTERS
Published: April 3, 2007
New York Times

Gary McKinnon, a British computer expert, was arrested in 2002 following charges by U.S. prosecutors that he illegally accessed 97 government computers - including Pentagon, U.S. army, navy and NASA systems - causing $700,000 worth of damage.

McKinnon lost his appeal against plans to extradite him to the United States to stand trial, as the judges stated that his actions were “intentional and calculated to influence and affect the U.S. government by intimidation and coercion.”

McKinnon is said to have carried out the biggest military computer hack of all time, and he could face fines of up to $1.75 million and 70 years in jail.


TJX Says Theft of Credit Data Involved 45.7 Million Cards
By THE ASSOCIATED PRESS
Published: March 30, 2007
New York Times

Information from at least 45.7 million credit and debit cards was stolen by hackers who gained access to discount retailer TJX’s customer information, in a security breach the company discovered three months ago. The theft, however, occurred in mid-July 2005, and the hackers obtained transaction information dating back to 2003.

TJX does not fully understand how the breach took place, who did it, or exactly how many cards were affected, because it deleted much of the transaction data in the normal course of business between the time of the breach and the time it was detected. But at least half a million customers had their personal data stolen.

But last week six people were caught using credit card numbers believed to match the TJX customer data; they had purchased $1 million worth of gift cards at Wal-Mart. TJX has not yet confirmed the validity of the matches.



Computer hacking is an art. Yes, it causes millions of dollars in damage and can harm national security. But there’s an aesthetic to creating programs that can bypass the heavy security measures companies and government agencies put in place. And that’s why great hacks go into the “Hackers Hall of Fame.”

4.02.2007

WiMAX in India

WiMAX Steps Forward in India

India is increasingly embracing wireless technologies. WiMAX has the potential to provide India with widespread Internet access that can usher in economic growth, better education and health care, and improved entertainment services, as it has done elsewhere in the world.

C-Dot Alcatel-Lucent Research Centre (CARC), in Chennai, has successfully completed the country's first live WiMAX IEEE 802.16e-2005 (also called Rev-e) field trial, using licensed spectrum from Aircel, a mobile service provider. The technology is now ready for commercial deployment. An Alcatel-Lucent statement said the trials were conducted in the 2.5 GHz band and successfully demonstrated applications in moving conditions, such as video streaming, high-speed file downloads, Voice over IP (VoIP) and Web browsing.

Aircel had launched WiMAX services based on the 802.16d standard in Chennai in October 2006 and in Bangalore in January 2007. It had also deployed WiMAX networks with limited coverage in prominent cities such as Coimbatore, Pune, Delhi, Cochin and Ahmedabad, and will soon be providing pan-city coverage at these locations in a phased manner.

Although Alcatel-Lucent and Aircel are driving the WiMAX technology forward, it was Intel that first started helping service providers in India carry out successful trial programs, and it continues to do so today.

WiMAX Steps Forward in India. CXOtoday.com, Feb. 21, 2007.



I think WiMAX in India is a great idea. High-speed wireless broadband based on WiMAX promises an economically viable way to accelerate Internet adoption, which could completely change lifestyles in India. Countries advance and go high-tech only when their communication systems are up to date. And high-speed Internet with better quality and service is an effective way for people to stay connected and to spread knowledge and information.

WiMAX is also going to be great for India because reliable Internet service for the masses can spur growth in e-Commerce and increase the number of investors participating in economic activities such as online stock trading.

Not to mention that schools and libraries in rural or remote areas without wired infrastructure can be connected to broadband and can receive broadcast lectures from urban schools and universities. This will help enhance the education system in India, if the schools effectively utilize the resource.

Overall, this is going to be a great way to spur business development in India and add to the country’s booming economy.

2.23.2007

Google/Earthlink in San Francisco


Last year Google and EarthLink won a joint bid to provide free Wi-Fi access to San Francisco. But some want to put the clamps on the deal and start over, according to David Needle in his article, “Hey Google: Hold Up on Wi-Fi By The Bay.” Activist groups are protesting the Wi-Fi plan because they want the city to use the extra capacity in its existing high-speed fiber optic network as the “backbone to build a truly modern, fast, and free, public communications system.” They want to see the city install as much fiber as possible, as well as Wi-Fi.

The service Google would provide runs at 300 kilobits per second, which is much faster than dial-up Internet service but slower than some broadband. However, activists want a higher-speed option, especially since the entire Internet infrastructure is moving toward broadband fiber. They complain that the deal gives Google/EarthLink a monopoly, when a city-run operation could provide free Internet service at speeds at least ten times faster.
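
To put 300 kilobits per second in perspective, here is a quick back-of-the-envelope comparison in Python; the 5 MB file size and the 3 Mbps broadband figure are illustrative assumptions, not numbers from the article:

    def download_seconds(file_megabytes, link_kbps):
        """Time to move a file of the given size over a link of the given speed."""
        bits = file_megabytes * 8 * 1024 * 1024   # megabytes -> bits
        return bits / (link_kbps * 1000.0)        # kilobits/s -> bits/s

    song_mb = 5  # roughly one MP3 song
    print("At 300 Kbps: %.0f seconds" % download_seconds(song_mb, 300))
    print("At 3 Mbps:   %.0f seconds" % download_seconds(song_mb, 3000))

At ten times the speed, the same file arrives in roughly a tenth of the time (about 14 seconds instead of 140), which is the gap the activists are pointing at.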

At this time, Google’s project is being delayed as they need more wireless antennae. However, the activists are glad because they want to pursue a better plan that won’t keep San Francisco behind the rest of the world.


Hmmm… When did Google start providing Internet service? I am quite surprised that a large city has contracted with Google, which has very limited experience in this area. But I’m also wondering how difficult a job this may be for Google, or for any company that had won the bid. San Francisco is a big city with lots of hills, and I bet it must be difficult to thoroughly blanket the city with Wi-Fi coverage. No doubt Google is having delays; they probably need more antennae than they anticipated.

But this project definitely has its advantages and disadvantages. It’s great that everyone in San Francisco will get free Internet service, no matter where they are within the city limits. The disadvantage is that a free service can easily become less sustainable in the long run: if no one is paying for it, it will be difficult to upgrade the service and networks. But perhaps advertising is where Google will make its money.

And my last thought is: why did San Francisco need or want to create this citywide Internet plan? It’s a metropolitan city, and I’m sure most people already have Internet access, plenty of Wi-Fi hotspots, and overall widespread broadband availability. Wouldn’t this 300 Kbps system be ‘slowing’ the city down?

2.17.2007

Why the Web Is Like a Rain Forest

The Web is moving away from a system where you had to find information on your own, either by stumbling across it or by having someone send you a link. Today, according to Steven Johnson, the Web is an “information ecosystem.” In his essay, “Why the Web Is Like a Rain Forest,” Johnson compares the rain forest to the Internet. As the sun supplies the energy for a rain forest, information supplies the energy for the Web’s ecosystem. Web 1.0 disseminated information to a limited number of people, whoever could get hold of it, and it was like supplying energy to a desert. Today, with Web 2.0, information is no longer wasted. A person’s thoughts, ideas, discoveries, or news stories are no longer just passed along by default. Web 2.0 has individuals and software programs that absorb the information on the Web. They take it in, analyze it, package or repackage it, and pass it along different venues where millions have immediate access to it and the ability to transmit it to others. Information, therefore, is also the nutrients of the forest. The nutrients can be consumed by all, helping everyone grow and add value to the existing system. The growth is the further expansion of information by those who use it, and the existing tools, to develop more innovations.
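
As a small, concrete sketch of that absorb-analyze-repackage loop, here is a minimal Python feed aggregator. It assumes the third-party feedparser library and a made-up list of feed URLs; it simply pulls each feed and repackages the newest headlines, the way the aggregation services Johnson describes do at scale:

    import feedparser  # third-party library: pip install feedparser

    # Hypothetical feeds; any RSS or Atom URLs would do
    FEEDS = [
        "http://example.com/blog/rss",
        "http://example.org/news/atom.xml",
    ]

    def latest_headlines(feed_urls, per_feed=3):
        """Fetch each feed and repackage its newest entries as (title, link) pairs."""
        headlines = []
        for url in feed_urls:
            parsed = feedparser.parse(url)
            for entry in parsed.entries[:per_feed]:
                headlines.append((entry.title, entry.link))
        return headlines

    for title, link in latest_headlines(FEEDS):
        print(title, "->", link)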

I definitely think that Johnson’s analogy is apt. The Web really is a jungle of information that keeps expanding with the help of new technology. Thanks to the e-mail notifications one receives from blog sites, bulletin board discussions, websites that hold information for specific groups of people, or any site that has anything relevant to someone, everyone can stay updated on just about everything. Software programs have made it so much easier for everyone to stay connected, in a far greater sense than the originators of the Internet must have imagined. There is so much information out there and so many ways to get it that it’s hard to imagine what greater innovations could outdo the existing ones. And with the heavy amounts of information being sent to us, how can we retain everything? A jungle gets crowded with so many trees and things living in it. When things get convoluted, it’s difficult to see them clearly and in isolation. That’s how the Web is. If you Google a certain topic, you get millions of hits. Some sites say one thing, while others say something else. With so much information coming at you, it can get difficult to sort out what you should or shouldn’t take in and use…