Sunday, June 13, 2010

SEO and SEM Insiders Call Mozenda "The Best Thing Since Sliced Bread"

The blogosphere is abuzz with it: Mozenda, the sexy new software that structures Internet data to custom specifications, is the best idea since sliced bread. And thanks to social media masters Will Critchlow and Sam Crocker, it won’t take nearly as long to catch on.

It took fifteen years for Otto Rohwedder’s bread slicing machine to gain wide popular acceptance, but it was only two months ago that Will Critchlow, while chatting with friends in a London pub, first heard about a small Utah-based company that was structuring Internet data. The words “structured Internet data” are enough to make most people’s eyes glaze over, but Critchlow, who works at the Internet marketing firm Distilled, couldn’t get it off his mind. What could he do with all of the Internet’s data at his whim?


Critchlow looked into Mozenda, and upon discovering its powerful capabilities, he did something odd: he shared it with the world in an SEOMOZ.org article titled “Data Visualization Techniques.” "I am massively impressed by the company’s software, and their support,” Critchlow wrote. “I am using it at the moment to gather social network data for a piece of client research. Mozenda gathers and organizes data very quickly… It is saving me enormous amounts of time."

Many people would have kept the discovery of a tool as powerful as Mozenda to themselves, using it to gain an edge over their competitors, but Critchlow was following a new philosophy in his line of business: sharing. Sharing builds publicity and garners respect within the industry, something Critchlow and Distilled have in ample supply.

Critchlow’s co-worker Sam Crocker jumped on the bandwagon, giving a presentation at the Search Marketing Expo in London about how to gain insider knowledge by using Mozenda to gather Google keyword search data. Grateful attendees began raving about the new technology—and Crocker—in earnest.

"Mozenda was a very smart thing for me to talk about,” Crocker said, when asked why he had shared the secret of his “atomic bomb” at the SMX conference. “I know some people like to keep their information to themselves, and I don't blame them for that. For me, the key is coming up with creative ways to use existing tools, even if it means using them for tasks other than those for which they were originally designed.

“I think the reason I gained as much exposure as I did from this conference was that I offered something new,” Crocker continued. “Something that most people didn't know about. I've been taught since day one at my company Distilled.co.uk to be as open as possible and share both within the team and within the industry. I think it's what makes us good at search engine marketing."

Mozenda CEO Brett Haskins is still a little shell-shocked by all the publicity his product is receiving. "I am amazed by people's creativity in using our tools. I am starting to see more and more applications that are very clever. When building our Internet Data Mining tool we did not specifically go after the Search Engine Marketing industry. What Sam Crocker did in his presentation was not something we initially imagined could be done with Mozenda. Sam broke new ground in how he is mashing up data. We are excited to receive his attention."

Just like sliced bread, Mozenda has set a new industry standard, providing an easy and inexpensive way to gather and organize Internet data. It’s too bad Otto Rohwedder didn’t have Critchlow, Crocker, or the Distilled team talking about his bread slicer—instead of languishing in obscurity for fifteen years, it would have moved out of the pub and into the marketplace at Twitter speeds.

For more information on Mozenda's data-gathering tools, visit http://www.Mozenda.com/.

Wednesday, May 26, 2010

Letter From Facebook

Hi Jon,
Facebook will roll out changes today that will make it easier for our users to understand and control their privacy settings. As this change will have an impact on our users, we wanted to let you, a valued advertising partner, know about it. Please note that this change will not affect your advertising campaigns and there is no action required on your part.



Facebook is a company that moves quickly, constantly innovating and launching new products to improve the user experience. The feedback we heard from users was that in our efforts to innovate, some of our privacy settings had become confusing.

We believe in listening to our users and taking their feedback into account whenever possible. We think the following changes address these concerns by providing users with more control over their privacy settings and making them simpler to use.



Starting today, Facebook will: 

* Provide an easy-to-use "master" control that enables users to set who can see the content they share through Facebook. This enables users to choose, with just one click, the overall privacy level they're comfortable with for the content they share on Facebook. Of course, users can still use all of the granular controls we've always offered, if they wish.

* Significantly reduce the amount of information that must be visible to everyone on Facebook. Facebook will no longer require that users' friends and connections be visible to everyone. Only Name, Profile Picture, Networks and Gender must be publicly available. Users can opt to make all other connections private.


* Make it simple to control whether other applications and websites access any user information. While a majority of our users love Facebook apps and Facebook-enhanced websites, some may prefer not to share their information outside of Facebook. Users can now opt out with just one click.

I encourage you to take a moment to read our CEO Mark Zuckerberg's blog post and check out the new Facebook Privacy Page.

Thanks,
The Facebook Ads Team

Tuesday, April 27, 2010

Searching Facebook Posts with the Facebook API (F8)

I got back from the Facebook conference last week and have let it settle in a little.
Of all the cool things I learned, being able to search through and read public posts has to be the most valuable and scary thing that came out of it.
Facebook has now made it very easy to search through all posts made between friends unless the user has specifically set them to private. Very few people have taken that precaution, and most still believe their posts can only be read by people they have accepted as friends. Well, sorry, but we all have access to your dirty little secrets.
Here is a search of Facebook posts for "oops sex." It is easy to see that these people have no clue that you can read their text and see their full name right next to it.
http://graph.facebook.com/search?q=opps%20sex&type=post

Then, if we want, we can pull out a certain name, see everything that person has posted, and see the friends they meant to send that information to, using the other API calls documented here:
http://developers.facebook.com/docs/api
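
For the curious, here is a minimal sketch in Python of what reading that endpoint looks like programmatically. It assumes the unauthenticated, circa-2010 behavior of the endpoint, which returned JSON with a top-level "data" list of posts; the search term is just an example.

```python
import json
import urllib.request

# Query the public Graph API search endpoint shown above.
# Assumes the circa-2010 response shape:
# {"data": [{"from": {"name": ...}, "message": ...}, ...]}
url = "http://graph.facebook.com/search?q=oops%20sex&type=post"

with urllib.request.urlopen(url) as response:
    results = json.load(response)

# Print each poster's full name right next to their message.
for post in results.get("data", []):
    name = post.get("from", {}).get("name", "unknown")
    message = post.get("message", "")
    print(f"{name}: {message}")
```

Loop that over many search terms and you have the beginnings of the brand-monitoring idea described below.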

If you don't want others reading your posts, you'd better set your profile to private!
--------------------
Now, for Internet marketers like myself, this has a lot of good uses.
Using a scraper like Mozenda, you could ping this search with many terms and learn what people are saying about your brand while building a social graph of this information. You could also find hot sales leads here, but be careful how you approach them. You could freak people out.

Tuesday, April 20, 2010

photos from f8


Here is my first photo from the 2010 f8
I think I saw Sean Parker and Mark.
More to come from f8

Sunday, March 28, 2010

Web Data Mining

Web data mining is the application of a variety of techniques to discover what kind of information Internet users are looking for. It may be used to gather several specific kinds of data (such as text-only data or multimedia data) and has a wide variety of potential applications. Data mining has been in use almost since the Internet first started, and more people are using these kinds of techniques every year. Companies that wish to attract more potential customers can use information gained from data mining to better target their advertising campaigns. University students doing research on how people use the Internet can use data mining to produce accurate statistics. Even if your sole purpose is to drive traffic and you are not doing any advertising at all, data collected from a data mining operation can give you all the information you need to form effective keywords.

Data mining may also be referred to as "scraping". So how is data mining/scraping accomplished? Software is the most popular method, and there are a number of companies out there that offer data mining services. Mozenda and Kapow Technologies are two popular companies devoted to data mining. This data can be invaluable to commercial advertisers because it gives them the information they need to put their products in front of as many web users as possible. After all, how can you target ads toward people when you have no idea what they are searching for? How can you start a viral campaign via social media if you do not know how people use the web? Knowing how people use the web is vital to the success of just about any website, commercial or non-commercial.

This is especially true for the little guys: small businesses and new websites trying to make a name for themselves. Data mining can help large companies as well, but the small company stands to reap much greater benefits. Large companies have already established themselves and have a customer base they can count on without the need for advertising. This is almost certainly not true for the small business owner or website operator. So what about the ethical side of all this? It is largely a matter of personal opinion, but virtually everyone agrees that web data mining can be used both ethically and unethically. Most people can agree that as long as data considered personal (sexual preference, religion, political affiliation, etc.) is not harvested, there are no ethical concerns.

From that point of view, mining what people in general search for (like popular search trends) and tend to look at is certainly ethical. Unethical, from this point of view, would mean that a particular company (or companies) harvests data in order to build complex profiles based on personal information to promote its products. However, just because this point of view is popular does not mean everyone sees the issue the same way. Some people believe all data mining is unethical and infringes on privacy. Others believe that data mining is totally acceptable and that people who have nothing sinister to hide have no reason to be concerned. Civil rights organizations are deeply concerned about government use and abuse of data mining technology to spy on its citizens. This is an interesting issue, and there is no telling what the future may hold for data mining. At this point in time, most forms of Internet data mining are totally legal and used by a large number of individuals and corporations all around the world.


web data mining

Saturday, March 13, 2010

Screen Scraping for Web Content Extraction

The Internet is one of the largest sources of information. That information is found in the images, text, and feeds associated with a web site. The problem with a typical web page is that valuable information is displayed in a human-readable format that is difficult to process automatically. All the colors, layout, and images are extraneous information that a computer would have to filter through to find and collect the information that is important. Screen scraping is a way for all that information to be consolidated and processed in a fast and effective manner.

Web pages today display content to the user through what is known as Hypertext Markup Language (HTML). This complex and often confusing code tells the typical Internet browser how the page should be displayed to human readers on the screen. All of this formatting code is of no value to an automated data processor or to a business. This is where screen scraping comes in.

The vast number of sites presents a challenge for those trying to collect information. Traditionally this task would require hours of sitting in front of the computer, loading each page onto the screen and performing manual research. While this is often the most detailed and accurate method of online research, it is time-consuming.

Screen scraping is performed by an automated system of programs that loads the code of a web page and filters through the content, looking for the desired information. Most of the code that is found on the web page is structural or graphical and does not have any real value. Images, font, and color are all parts of this structural code. Found within this are small pieces of key information. This information is taken by the program and stored for analysis and report generation. By automatically filtering through the code required to display a web page, key facts and figures can be lifted and processed in a timely manner.
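
To make that concrete, here is a minimal sketch of the filtering step in Python, using the third-party requests and BeautifulSoup libraries. The URL and the element names are hypothetical stand-ins for whatever page and fields a real project would target.

```python
import requests
from bs4 import BeautifulSoup

# Load the raw code of a web page (the URL is a hypothetical example).
html = requests.get("http://example.com/products").text

# Parse the markup so the structural and graphical code can be skipped over.
soup = BeautifulSoup(html, "html.parser")

# Filter through the content, keeping only the desired pieces of information.
# The class names are assumptions about how the target page is structured.
for item in soup.select("div.product"):
    name = item.select_one("span.name").get_text(strip=True)
    price = item.select_one("span.price").get_text(strip=True)
    print(name, price)  # in practice, store these for analysis and reporting
```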

With such a complex and sensitive task as screen scraping in high demand, it is no surprise that there are several highly rated and professional companies that provide this valuable service. The applications of screen scraping are endless. Corporations, private companies, and individuals can use these services to perform market research or gain precious information on current trends and events. Economic information can be gathered using a screen scraping service from sites that continuously update their pages with stock prices or interest rates. The current weather forecast can be found automatically using screen scraping. Strategic information about market competition can be taken from the Internet. The value and information that this service provides is key to businesses and individuals alike.

Mozenda is a company that provides the valuable services of screen scraping and web data processing. Using advanced programs, any web page can be captured and its text and images saved for future use or analysis. Images can be automatically downloaded from the Internet, and valuable data can be stored in XML or RSS formats using Mozenda's services. In addition to the information collection process, Mozenda offers market research and intelligence. For companies needing information but unable to sift through countless websites one by one, the services offered by Mozenda can help them get the valuable information fast and get back to what they do best.

Kapow Technologies is another group providing companies with screen scraping and data analysis services. By removing the unnecessary code from the web page, the information can be taken and saved for future use. By utilizing advanced logic, the extraneous data is removed, leaving only the key parts that corporations, businesses, and individuals need.

Information moves at the speed of light, and screen scraping is a way for that information to be gathered in a precise and automatic way. Professional screen scraping services are a tool that keeps pace with this flow of information and organizes it into a format that can be quickly analyzed and processed.


Screen Scraping

Saturday, March 6, 2010

Screen Scrapers that help collect data

Screen scrapers are useful computer programs that collect character-based data from the output displayed by other programs. Screen scrapers are designed to extract and collect specific data and to present it in a richer display format using tables or graphs. They can also simply collect data to be indexed for storage. Screen scrapers are increasing in popularity and usage, and are also referred to by other names such as content miner, website ripper, automated data collector, web extractor, website scraper, and HTML scraper.

When activated, a screen scraper will search through a website's code, filtering out extraneous markup to provide a better looking presentation. A scraper only looks for useful data, ignoring the code that serves to present the original page in its original layout. A web scraper just collects the data and presents it without all the accessories that come with the original HTML.

Screen scrapers are used for a number of applications. A popular example of their use can be seen in the way search engine spiders work. Search engine spiders crawl millions of websites and their pages, collecting data and indexing it. When a person conducts a search, the indexed data is presented as search engine results.

A large number of screen scrapers search through the HTML code of websites to collect data. Some, however, can also handle pages driven by scripting languages such as PHP and JavaScript. The collected or mined data can then be presented as HTML, which can be accessed using a web browser, or stored as text to be accessed offline.
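
As a toy illustration of the spider idea described above, here is a short Python sketch that crawls a handful of pages from a hypothetical seed URL, strips out the markup, and builds a simple in-memory word index. A real spider would add politeness delays, robots.txt handling, and persistent storage.

```python
from collections import defaultdict
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

# Toy crawl-and-index loop: fetch pages, strip markup, index the words.
seed = "http://example.com/"  # hypothetical starting point
to_visit, visited = [seed], set()
index = defaultdict(set)  # word -> set of pages containing it

while to_visit and len(visited) < 10:  # small cap, since this is a sketch
    url = to_visit.pop()
    if url in visited or not url.startswith("http"):
        continue
    visited.add(url)
    soup = BeautifulSoup(requests.get(url).text, "html.parser")
    # Ignore the presentation markup; keep only the visible text.
    for word in soup.get_text().split():
        index[word.lower()].add(url)
    # Queue the links found on this page for crawling.
    for link in soup.find_all("a", href=True):
        to_visit.append(urljoin(url, link["href"]))

print(sorted(index.get("data", set())))  # pages mentioning "data"
```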

Screen scrapers save a lot of time and energy. People no longer need to search for appropriate sites and click through links to find and collect the data they need. The web miner will automatically search through websites based on relevant keywords and generate the charts, spreadsheets, graphs, and other data needed for comparisons, presentations, and reports. Screen scrapers can also effectively recover information stored on systems that can no longer be accessed directly because of incompatibility issues caused by new software or hardware.

While screen scrapers are very useful to legitimate businesses and website owners, they can also be used for illegal and unsavory purposes. Legitimate businesses, website owners, and search engines make good use of web miners to provide useful services and to collate needed data quickly and with relatively little effort. However, some individuals, companies, and web owners wrongly use screen scrapers to mine and collect email addresses from websites to use for spam advertising.

The misuse of screen scrapers by some has led to an ongoing argument within the web community about the ethics and legalities involved in using them. Some argument also exists over copyright issues, since a screen scraper can copy one person's hard work from a website and then present it in another format on another website. And because screen scrapers discard data such as the adverts on a web page, people who rely on adverts to generate revenue complain that their ads get left out. For these reasons, many website owners are taking measures to prevent their sites from being scraped. At the end of the day, even though some make use of screen scrapers for negative purposes, they remain very handy tools that can effectively and legitimately save you time and money.

Sunday, February 14, 2010

Internet Data Mining Software

Internet data mining software is used almost universally. Many businesses and government agencies utilize this software as a more efficient method of searching the Internet for the information they need. Law enforcement agencies use data mining software to search for criminal activity that goes on over the Internet. These criminal acts vary and can easily be confused with legal activity, so software that can weed out the bad from the good is very useful to them. To go one step further, homeland security uses data mining software to prevent terrorism. Many individuals unknowingly put information on the Internet that compromises their goals because they believe it to be a safe place when, in fact, it is constantly being scoured by government data mining software that searches for words or phrases related to terrorist activity.

Internet data mining software is also utilized by business owners. Knowing what your competitors are doing with specials and pricing gives you a huge advantage in your targeted market. Internet data mining software can pick out the specials offered by particular businesses, as determined by the software's owner. There is so much information stockpiled on the Internet that looking for specific data by hand can be a lost cause, and even after finding information related to whatever you searched for, most of it is probably useless. Internet data mining software negates all of these problems, making it a very useful tool for anybody who researches extensively.

One of the biggest names in data mining software is Mozenda. Mozenda is a very advanced data mining program that can be automated, meaning that even while you are away it can carry out whatever task you gave it. Mozenda creates virtual agents that can extract and manage data from various sources and can even publish data to multiple destinations at the same time. All of the data extracted through Mozenda is stored in its virtual warehouse, where it is kept safe and secure until you are ready to access it. Companies will find Mozenda very useful because of how easily it integrates with a data system that is already in place, enabling a company to store its data and then publish it automatically as it wishes.

Kapow Technologies is another major player in the Internet data mining software industry. Kapow Technologies is geared towards big business: its software helps businesses communicate better with their partners, allowing the sharing and publishing of data between partners to be seamless. Kapow Technologies takes pride in providing up-to-the-second web data related to the interests of the business, which allows for more informed business decisions.

Internet data mining software will continue to be a huge part of businesses worldwide as long as the internet remains functional. The versatility of the software means that many different types of businesses will utilize it. The fact that there is information regarding everything one could imagine on the internet means that there is probably someone who could use software to extract that data, share it with others, and then publish it again to their advantage. On top of this, data mining software can also show you how your product or service is doing. This is achieved by using the software to mine your own data, and then seeing how relevant that data is compared to other data of the same variety.

Saturday, February 13, 2010

Web Scraping Software

Web scraping software is an innovative tool that makes gathering large amounts of information relatively easy. The software has numerous applications for anyone who needs to pull comparable information from various locations and put it into usable context, and this method of finding extensive information in a short period of time is cost-effective. It is used every day in business, medicine, meteorology, government, and law enforcement.

The software is user-friendly and can be operated by anyone from non-technical data collectors to experienced web designers. Programs are available for purchase in stores or online. Certain online vendors, such as Visual Web Ripper, provide video guides to go along with their products, with tutorials on how to create a project, navigate the software, and extract data. Some websites let users establish an account, start a monthly membership, and access the software that way. There are also website tutorials providing step-by-step instructions that demonstrate how to build your own screen scraper using various programs.

A user opens the software and begins by programming an “agent,” the tool that will retrieve the information. A list is made of the pertinent information for the search, and parameters are set as to how extensive the search must be and in which locations. The user has full control over which pages are crawled and how elements are mapped. Visual and/or textual information may be retrieved, and web pages or hard disk files can be searched. Once the information is obtained, it can be saved as CSV, TSV, XML, or RSS, or into spreadsheets or databases, and the data can be formatted into chunks, such as names and addresses, for easy retrieval and later analysis. Agents start gathering data with the click of the run button or when programmed to start at specific scheduled times. The program can also alert users to changes it encounters within scheduled search fields.
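
To give a rough feel for what such an “agent” boils down to, here is a small Python sketch that scrapes a page on a fixed schedule and appends the extracted rows to a CSV file. The URL, the element names, and the one-hour interval are hypothetical choices; commercial products wrap this kind of loop in a point-and-click interface.

```python
import csv
import time

import requests
from bs4 import BeautifulSoup

# A tiny "agent": scrape a page on a schedule, append rows to a CSV file.
# The URL and selectors are hypothetical examples.
URL = "http://example.com/listings"
INTERVAL_SECONDS = 3600  # run once an hour
RUNS = 3

for _ in range(RUNS):
    soup = BeautifulSoup(requests.get(URL).text, "html.parser")
    rows = [
        (item.select_one(".name").get_text(strip=True),
         item.select_one(".address").get_text(strip=True))
        for item in soup.select("div.listing")
    ]
    # Append this run's results; the CSV doubles as the agent's archive.
    with open("listings.csv", "a", newline="") as f:
        csv.writer(f).writerows(rows)
    time.sleep(INTERVAL_SECONDS)
```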

Web scraping software provides customer information, marketing information, and competitor information. Businesses develop a closer relationship with their customers by discovering which products are selling, what product defects have been encountered, what consumers like or dislike about a product, or which groups of customers favor a product. The software guides companies' decisions by analyzing how they stand in relation to their competitors and by surfacing current or upcoming trends. Price comparisons, buying and selling trends, and consumer logistics are all data that can be gathered, stored, analyzed, and implemented into profitable business platforms.

There have been legal ramifications, as some have complained about intrusion and copyright infringement, and legal boundaries and guidelines may become established in the future. Some sites have implemented software to prohibit or block web crawlers.

Visual Web Ripper, Irobot, and Happy Harvester are a few of the top-rated web scraping programs currently available, and there are many others on the market. This tool is sure to gain popularity as it makes data research and monitoring quick and effective.



Screen Scraper

Sunday, January 24, 2010

The Advantages of Data Scraping

Unlike traditional search engines, which compile web sites based on popularity but provide no mechanism for extracting information, new software from companies such as Mozenda (www.mozenda.com), Screen Scraper (www.screen-scraper.com), and Fetch (www.fetch.com) utilizes a process called data scraping to gather relevant information from any website. Data scraping allows an organization to quickly and efficiently gather the data it needs in an accessible format, without the bother of cutting and pasting. A user can have instant access to email addresses, competitor pricing, supplier information, and other specific data, arranged in a single display page that can double as a document.


Data scraping tools combine the broad collective power of a traditional search engine with the refined nuance of a personal file clerk. By pulling the needed information based on specific criteria entered by the user, the software can greatly simplify the process of assembling reports, comparing prices, and performing other analysis. Plus, Mozenda has an easy-to-use graphical interface, which allows anyone to use familiar browser features to master the software's functionality instantly.

Data scraping culls incredibly valuable information that would otherwise be nearly impossible to acquire using traditional search engines and copy-and-paste methodologies. While search engines tend to focus on displaying web site addresses, Mozenda goes into the website itself to extract the information you need. To demonstrate this specificity, Mozenda is able to recognize types of data formats like addresses, prices, and emails, and then capture those results cleanly into a CSV, XLS, or XML file.
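
To illustrate what recognizing a data format can mean in practice, here is a minimal Python sketch that pulls email addresses and prices out of raw scraped text with regular expressions and writes them to a CSV file. The patterns are deliberately simple examples, not a description of how any particular product works internally.

```python
import csv
import re

# Hypothetical raw text scraped from a page.
text = "Contact sales@example.com for the widget, now $19.99 (was $24.50)."

# Simple example patterns for two common data formats.
emails = re.findall(r"[\w.+-]+@[\w-]+\.[\w.]+", text)
prices = re.findall(r"\$\d+(?:\.\d{2})?", text)

# Capture the recognized values cleanly into a CSV file.
with open("extracted.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["type", "value"])
    writer.writerows(("email", e) for e in emails)
    writer.writerows(("price", p) for p in prices)
```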

Data scraping also allows you to quickly identify other specific information, such as changes on web pages, new data values, and general updates to specific data sets. For example, if you are scraping information from your supplier's catalog, Mozenda will not only compile current products and prices but also automatically pick up new products and price changes as the supplier adds them to the catalog. By using data scraping, you can begin acquiring usable information in mere minutes instead of spending days or even weeks attempting to dig up the same information with traditional copy and paste.
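
One simple way to detect such changes, sketched here in Python with a hypothetical URL and state file, is to hash the page content on each run and compare it against the hash saved from the previous run; only when the two differ does the scraper need to re-extract anything.

```python
import hashlib
import pathlib

import requests

# Detect changes on a page by comparing content hashes between runs.
# The URL and state file are hypothetical examples.
URL = "http://example.com/catalog"
STATE = pathlib.Path("catalog.sha256")

content = requests.get(URL).text
digest = hashlib.sha256(content.encode("utf-8")).hexdigest()

previous = STATE.read_text() if STATE.exists() else None
if digest != previous:
    print("Catalog changed; re-extract products and prices here.")
    STATE.write_text(digest)  # remember this version for the next run
else:
    print("No changes since the last run.")
```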

In the case of Mozenda, their highly efficient data scraping software has won the company several high-profile clients, many among the Fortune 100 elite. These companies recognize that if the internet can be thought of as a giant database, then data scraping software is the ultimate database mining tool. In an age where every second counts against the competition, data scraping is the quickest route to getting critical information from the web in a useable format.

Additionally, for companies that compile detailed text-based reports, the time-consuming chore of research is shortened by employing data scraping. By virtue of refinement and highly advanced programming, data scrapers can procure large amounts of information in readily usable formats at incredible speed. Data that previously had to be compiled by hand from a multitude of sources becomes so simple and efficient to procure that any company can quickly move to more automated processes.

As the number of companies using data scraping technology grows, the marketplace is becoming increasingly competitive and proficient at finding new ways to seek out information. The software is constantly being refined and developed to match the needs of its users. Because the Internet is such a rapidly changing medium, and the amount of information is always increasing, the need for an automated method of finding, acquiring, and utilizing this information is more important than ever. Data scraping is one way to do this, and it provides a huge advantage to the companies that have figured it out.

Tuesday, January 12, 2010

Capture Website Data

Data from websites can be easily captured and archived, even after the original site has been taken out of service. This can be done using any of a number of useful APIs (Application Program Interfaces): libraries of code that can be used to query a website's database and retrieve the data you are interested in. This may include email messages, blog feeds, office documents, and news site posts.
The stored pages can also have keywords attached for easy retrieval in the future, so if the information becomes obscured online, the document can still be found without hassle. Simple control structures ease the work of information retrieval.
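
As a small illustration, here is a Python sketch that captures the entries of a blog feed through its RSS interface and archives each one locally with a crude keyword line for later lookup. The feed URL is a hypothetical example, and the keywording is deliberately naive.

```python
import urllib.request
import xml.etree.ElementTree as ET

# Archive the entries of a blog feed locally, tagged with simple keywords.
FEED = "http://example.com/blog/rss.xml"  # hypothetical feed URL

with urllib.request.urlopen(FEED) as response:
    tree = ET.parse(response)

with open("archive.txt", "a", encoding="utf-8") as archive:
    for item in tree.iter("item"):  # standard RSS 2.0 item elements
        title = item.findtext("title", default="")
        body = item.findtext("description", default="")
        # Naive keywording: the five longest words in the title.
        keywords = sorted(title.split(), key=len, reverse=True)[:5]
        archive.write(f"KEYWORDS: {', '.join(keywords)}\n")
        archive.write(f"{title}\n{body}\n---\n")
```
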
Web Scraper

Sunday, January 3, 2010

Website Scraper Uses

Website scrapers can be used in a variety of ways for a variety of purposes. Their uses range from the helpful, such as collecting and displaying knowledge from many different fields for the purpose of education, to the dubious, such as generating traffic for a site full of stolen content for the sake of advertising revenue. How a website scraper is used, for public benefit or personal gain, is up to the individual. A website scraper is like every bit of technology: the ultimate good or bad is up to the user.