My approach is intended to make visible the content that you already have so search engine users who are interested in that content can find you -- and to accomplish that without affecting your current, branded site design.
First, let's take a closer look at the typical problems -- what kind of page designs get in the way.
Then let's look at the faulty solutions typically proposed by search optimization companies -- such as metatags and doorway pages.
Then let's look at the best and simplest solution.
Unfortunately, branding rules and politics may prevent you from solving the problem the simple way.
But there are workarounds that allow you to leave your corporate design as is and make your pages more visible to search engines without violating search engine rules.
Many sites present dynamic pages -- generating pages on-the-fly from databases. Such pages typically have a ? in the URL, which serves as a stop sign for search engine crawlers. These crawlers need to avoid being trapped at dynamic sites, which could generate a huge number of pages, clogging search engine indexes with useless content.
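To make that "stop sign" concrete, here is a minimal sketch (in Python, with hypothetical logic -- not any real crawler's code) of the kind of filter a crawler might apply to skip dynamic URLs:

```python
from urllib.parse import urlparse

def looks_crawlable(url):
    """Treat any URL with a query string as dynamic and skip it."""
    return urlparse(url).query == ""

print(looks_crawlable("http://example.com/articles/search.html"))  # True
print(looks_crawlable("http://example.com/page.asp?id=1234"))      # False
```

A crawler applying a rule like this never follows the second link, so a database-driven site's content stays invisible no matter how good it is.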
Other sites use javascript in such a way that very little text is visible to search engine crawlers. Even the links to other pages at your site may be buried, so that if a crawler finds one page at your site, it can't follow a trail of links to discover the rest of your site.
Other sites use frames or tables, which, while not blocking crawlers, wind up confusing search engine users. For instance, when a frames page is indexed, each window is indexed separately, so someone finding a match and clicking on it will be presented with that window alone, out of context. And when pages dependent on tables are indexed, the words are interpreted as appearing in sequential order, left to right, instead of associated by columns and rows. As a result, phrases get jumbled.
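The table problem can be shown with a short sketch (the content is illustrative, not real indexing code): reading cells in source order splices unrelated columns together.

```python
# The rows of a two-column layout table, as a crawler meets them
# in the HTML source (made-up content):
table_rows = [
    ["Our products", "Our services"],
    ["are reliable", "are affordable"],
]

# Indexing the cells left to right, row by row:
indexed_text = " ".join(cell for row in table_rows for cell in row)
print(indexed_text)
# The column phrase "Our products are reliable" never appears intact.
```

A searcher querying the phrase "our products are reliable" gets no match, even though that is exactly what the page says to a human reader.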
When companies realize that their pages are poorly represented or not at all represented in search engine indexes, they typically target the symptoms rather than the cause of the problem. They'll set measurable goals that are irrelevant to their true business needs -- such as "ranking" for specific key words -- and then hire experts who try to trick search engines into delivering the desired results. In the process, the "experts" may break the rules that search engines have set up to try to keep their indexes truly useful, and thereby get the company's pages completely thrown out. In any case, the experts typically have their hands tied -- they are unable to add new useful content to the site and are unable to change the site's basic design. Hence they often propose adding metatags to existing poorly designed pages and/or creating doorway pages.
There are two kinds of metatags that matter to search engines.
The description metatag indicates the text that should appear in a search engine results list. The default is the first couple of lines of text. If that text is nonsense -- random words that happen to be associated with graphics -- or if there is no plain static text at all, because of the way the pages are generated -- a description metatag might be useful, but it's just a band-aid for a problem that you created for yourself with your page design.
Keep in mind that the two most important parts of a Web page for purposes of search engine ranking are the HTML title and the first couple of lines of text. Yes, the description metatag can give you a coherent description in results lists, instead of gibberish; but the first lines of text still retain their priority for ranking. In general, search engine ranking algorithms pay little or no attention to metatags, mainly because they are very vulnerable to "spam". Search engine companies want to ensure as much as possible that what actually appears on the page is a good match for their users' queries; and they trust the actual static words on the page far more than anything that might appear in metatags. So a page that has no description metatag and has a good clear description in the first couple of lines of text is likely to do better in ranking than a page with those very same words presented in a description metatag and little or no static text on the page itself.
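A rough sketch of that fallback behavior (hypothetical, not any engine's actual algorithm): the snippet shown in a results list comes from the description metatag if one exists, otherwise from the opening text of the page.

```python
def results_snippet(meta_description, page_text, limit=150):
    """Pick the text to show in a results list: the description
    metatag if present, else the page's opening text."""
    source = meta_description or page_text
    # Normalize whitespace and truncate to a display-friendly length.
    return " ".join(source.split())[:limit]

# A page with clear opening text needs no metatag at all:
print(results_snippet(None, "Plain-language advice on making web content visible to search engines."))
# A metatag only papers over a page whose visible text is junk:
print(results_snippet("A hand-written summary.", "spacer.gif nav_btn_03 home_rollover"))
```

Either way the searcher sees something readable -- but only the first page also has real text for the ranking algorithm to match against.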
The second kind of metatag is called "keyword", a term that confuses many people, leading them to think in terms of databases. Many presume that search engines index keywords, and hence that the keyword metatag is very important. In fact, only two of the major search engines, Inktomi and AltaVista, pay any attention at all to keyword metatags. And AltaVista gives plain text on a Web page higher priority than anything in keyword metatags, for purposes of ranking.
Today's search engines in fact index every word on every page. And many search engine users enter multi-word queries. So the more useful text you have on your pages, the more likely it is that searchers who want your kind of information will find you. Instead of spending time and money generating key word metatags, you should add more and more useful content to your site.
To get a concrete feel for how this works, try a high end web traffic analysis program, like WebTrends Log Analyzer. (You can download it for a 30-day free trial). Such a program will let you see how people come to your pages -- showing not only the volume of traffic from each of the major search engines, but also the queries that people used to find you. You are likely to be amazed at the variety and the detail of the queries. Very few people enter single word queries -- and those who do are not likely to be good prospects.
To get a sense of the typical behavior
of search engine users, check stats generated for my site
using WebTrends on one particular day in January. Go
to www.seltzerbooks.com/jan25/jan25.htm [no longer online] In the
left column, click on "Referrers and Keywords". Then click on
"Top Search Engines", "Top Search Phrases", and
"Top Key Words". The key words are relatively useless, while
the phrases are rich and informative.
Keywords only matter for advertising -- search engines will sell you ad space on pages generated when certain words appear in the query. But for actual searching, keywords are meaningless. Put your effort into generating more good content.
Also don't waste your time with key-word position checkers -- programs, like WebPosition Gold, that tell you how your whole site or particular pages rank for particular queries. Such programs bang away repeatedly and automatically at search engines, adding an enormous load to those systems and hence slowing response time for actual users and forcing the search engines to invest more to keep performance at acceptable levels. While the results you get from such programs might make you look good to your boss, they mean little or nothing in terms of how much traffic your pages are likely to get by way of search engines. For traffic, you need content, and lots of it.
Some search engine optimization companies will propose creating "doorway" pages for your site.
A doorway page consists of artificially
generated content, designed to emphasize pre-selected
keywords. The page itself is meaningless. Its sole
purpose is to fool search engines, with the objective of
coming near the top of lists of matches when queries include
those key words. In fact,
search engines strongly discourage that practice, and when
they detect it, many will blacklist those pages and sometimes
even all pages from the same
site -- kicking them out of the index.
You might think, "Why worry? There's more than a billion pages out there. How will anyone ever know?" As was evident in presentation after presentation at a recent search engine conference in Boston, your competitors and the people hired by your competitors to design their sites watch what you do very carefully. They understand these techniques and how to spot them. And when they spot them, they blow the whistle -- loud. Search engine folks are inundated with alert messages about wrongdoing, sent by competitors of the companies that are using those techniques.
Also, doorway pages typically reside on
the servers of the "search optimization" companies that
produce them. They automatically
redirect traffic to actual pages at the customer's site. That
means that the customer receives more traffic, but to the
search engine the traffic appears
as if it is going to the optimization company. So when the
search engine calculates the "popularity" of pages for
purposes of ranking, the customer
site gets none of that credit.
In other words, many people go to great lengths and great expense to try to fool search engines. But the risks are great -- not just annoyed search engine users, but your site being blacklisted.
It is far less expensive and far more
effective to write good, useful content and present it simply
so it can be properly indexed.
Since search engines index every word on every page they find, the more useful text you have at your site, the more likely your pages will be found by people who are interested in them. This is a random game -- the more content you have, the more dice you throw, so the more likely you'll win.
Focus on building content, not on trying to trick search engines.
Large pages are more valuable than short ones -- large in terms of text, not graphics. Graphics are useless for search engines.
For maximum effectiveness, you need plain static HTML pages (not ASP pages), without frames or tables or Java applets.
The most important text should appear at the top of the page. In fact, the first couple of lines of text should make sense as a description of the page.
And the HTML title (not the file name, and not the headline that appears on the page, but rather the title in the HTML header) should be carefully written to mention everything that is important about the page, in very few words. That title will appear in search engine results lists as the words that are linked to your page. And words that appear in HTML titles are typically given very high priority by search engines -- in other words if two pages match a given query and one of those pages has the query words in the HTML title, that is the page that will appear on top.
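As a sketch of what this recommends -- a descriptive HTML title plus opening text that doubles as the description -- here is a hypothetical page-assembly helper (all names and content are illustrative):

```python
def friendly_page(html_title, first_lines, body_html):
    """Assemble a minimal static page: a descriptive <title> in the
    header, then opening text that reads as a description of the page."""
    return (
        "<html>\n"
        "<head><title>{}</title></head>\n"
        "<body>\n"
        "<p>{}</p>\n"
        "{}\n"
        "</body>\n"
        "</html>"
    ).format(html_title, first_lines, body_html)

page = friendly_page(
    "Search engine basics: making site content visible to crawlers",
    "How page design can hide your content from search engines, and what to do about it.",
    "<p>(rest of the article text)</p>",
)
print(page)
```

Note that everything a crawler needs -- title, description, content -- is plain static text, with no scripts or frames in the way.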
These static pages can have static
graphic images (jpg or gif), if you like. Such images will not
help you with search engines, but won't hurt you either -- so
long as none of the text is embedded in images.
Even at Digital Equipment, the company that invented the pioneering search engine AltaVista, corporate branding rules got in the way of making the company's pages findable by search engines.
Typically, the branding folks come up with a "template" based on what they have done in the past with their print brochures, without realizing that they are costing the company traffic and hence revenue with such rules.
Branding rules, which are intended to present a consistent corporate visual image to the world, were often created before the Web or by people who had no real knowledge of how the Web works. Once in place, such rules can make it very difficult to do what is necessary to use content to attract traffic to a Web site by way of search engines. And bureaucracy often makes it very difficult to change such rules once they are in place.
Basically, there are two parallel design goals when building a Web site.
1) Design for the optimum user experience at the site, with a consistent look-and-feel that follows corporate branding standards.
2) Design for optimum traffic and revenue growth by making your content clearly visible to users of the major search engines.
Far too often, large companies put all their emphasis on the user experience at the site and forget the search engines which are necessary for getting people to the site in the first place.
Only people who find your site can appreciate the experience.
You need to take steps to draw more people to the site. But the design that optimizes the user experience will probably get in the way of search engines, and hence cost you traffic and business.
If you have no chance of converting the branding and marketing folks from their current search-engine unfriendly site design, consider creating mirror pages.
This approach keeps your current branded design in place, but you can still take full advantage of your text content to attract more visitors and hence generate more business.
Brand plays an important role on the Web -- for existing customers and partners and for users who have found your site and want to navigate through it for further information.
But to attract new prospects using search engines, you need to take a different and parallel approach that emphasizes text.
Your mirror pages have the exact same text content as your standard pages, but they are presented in static HTML, with meaningful HTML titles and with the first couple lines of text written so they can serve as a description. They have no metatags. They have a minimum of graphics.
If the standard pages are very tiny -- forcing the visitor to look at a whole series of pages to read or print what is really a single document -- the mirror pages will consolidate that text (serving at the same time as a "printer-friendly" version).
Each of these pages has links to your main site, with an explanation that people should go there for the optimum experience -- with all the graphics and dynamic effects. But people who were looking for specific information and found this mirror page by way of a search engine will be quite happy because you have provided those people with exactly what they were looking for. There are no links from mirror pages to other mirror pages, and no links from standard pages to mirror pages.
Static HTML pages attract traffic by way of search engines and provide new visitors with the information they want, very efficiently.
Also create a sitemap page that consists of a hyperlinked list of all your mirror pages. You submit that sitemap page, not your home page, to the search engines.
Search engine crawlers follow a trail of links to "discover" the content on the Web. If your pages today are not well represented in search engines, chances are good that you are using a technique that either halts crawlers (e.g., dynamic pages with question marks in the URLs) or presents links to other pages at your site using a technique that makes those links invisible to crawlers, such as dropdown menus. By sending a crawler to a static HTML sitemap with links to every one of the search-engine-friendly pages at your site, you make it easy for that crawler to find all of your content.
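A sitemap of this kind can be generated mechanically. Here is a hypothetical sketch (the URLs and titles are made up):

```python
def sitemap_page(mirror_pages):
    """Build a static HTML sitemap from (url, html_title) pairs, so a
    crawler starting here reaches every page one link deep."""
    items = "\n".join(
        '<li><a href="{}">{}</a></li>'.format(url, title)
        for url, title in mirror_pages
    )
    return (
        "<html><head><title>Site Map</title></head>\n"
        "<body><ul>\n{}\n</ul></body></html>"
    ).format(items)

print(sitemap_page([
    ("dynamic-urls.html", "Why dynamic URLs stop crawlers"),
    ("html-titles.html", "Writing HTML titles that describe the page"),
]))
```

The list items are plain anchors -- no JavaScript menus -- so every link is visible to a crawler.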
Also, search engine crawlers typically
stop after going several layers deep into a site (following
2-3 links in succession). (Last I heard, Lycos only
went one layer deep). If you have a sitemap with links to
every mirror page, and if you submit that page, instead of
your home page, to the major search
engines, you make it so every page at your site is just one
layer deep.
To put it another way, some schools prepare their students to pass a particular test; while others prepare them for success in the real world. Don't mistake the measurement for the reality. Don't be impressed by the "rankings" of competitors or by the "rankings" that search engine optimization companies claim they can deliver. Focus on your real business objectives. Don't try deceptive tricks. They backfire. Provide real content in a form that can be well-indexed and that is also immediately useful to your customers and prospects.
Your mirror pages have text and links to encourage visitors to go to your standard site. If those standard pages provide real benefit and you explain that benefit clearly, visitors should head there. Do not use automatic redirects and other cloaking mechanisms that take visitors to a page they didn't ask for. That's "spam." Treat your visitors with respect, and abide by the sensible rules of search engines.
People arriving at mirror pages from search engines immediately see what they were looking for. Those arriving at doorway pages are automatically redirected to other pages where they typically do not find what they are looking for because the exact words of the query do not appear on the destination page.
You want to target search engine users
who are actually looking for the content on your pages. The
more pages you have with the full-text indexed,
the better your chances that those people with unique queries
will find your pages.
Also, mirror pages reside on your site. Any traffic directed there by search engines counts toward the popularity of your pages and site when those search engines determine which pages should appear high on lists of matches.
Check www.jeremyjosephs.com. That is the site of freelance writer Jeremy Josephs, in Montpellier, France. The standard site looks professional, but got zero traffic.
Then check www.jeremyjosephs.com/sitemap.html. That will take you to his sitemap, with links to mirror pages for his site. Check a few of the links to get to plain-text search-engine friendly versions of his articles and books. By using this technique, he now gets over 1000 visitors/week.
Remember that your goal is not "maximum positioning" for your pages with regard to particular keywords. Your goal is not just getting more pages into search engine indexes. It's a question of traffic and new business.
Also, don't expect immediate results. It may take months before your new mirror pages get into the search engine indexes. Search engines vary as to how frequently they crawl and update their indexes, ranging from about a week with AltaVista, to 6-8 weeks with many, and up to 3 months with some. And if you trigger spam alarms with doorway pages and other tricks, it could take years to untangle the mess.
There are no links from your full-blown pages to the mirror pages -- only links from the mirror pages back. Hence only people arriving by way of search engines or by bookmarks/favorites are likely to ever find these pages. But since the pages reside on your servers and provide your content directly, issues of branding could still arise. These pages have a text-heavy look and undoubtedly will not conform to your corporate branding rules.
You need to explain that these pages are not part of your site, rather they are an add-on, intended to draw traffic, and that the corporate rules should not apply here.
Think of search engine traffic like word-of-mouth referrals to your company. You brand your brochures and your advertising. But you wouldn't consider branding word-of-mouth marketing.
Instead of asking a friend, a Web user turns to a search engine and asks -- "where can I get information about xyz and pdq?" The search engine then provides a plain-text results list -- that includes no one's logos, no one's branding. And when users click on links in such lists, they expect to go straight to pages that have the kind of information they are looking for. This is very different from calling up a sales person and asking to receive a brochure or a spec sheet.
Explain the difference between
optimizing the user experience and optimizing search engine
results. Try to find ways to work together to generate new
business for the company.
What if you can't convince management to allow you to create mirror pages? Your pages are all dynamic, with a ? in the URL, so crawlers come to a halt whenever they reach one of your pages, not following any trail of links. And your pages have no useful text visible to search engine crawlers. Are you totally dead in the water? Not necessarily.
In this worst case scenario,
1) Write useful informative HTML titles for each and every
page (every one different)
2) Write description and keyword metatags for each and every page (every one different, and without much repetition among the HTML title, description metatag, and keyword metatag).
3) Create a sitemap page -- a static HTML page that ignores
branding rules: just a list of the HTML titles of all your
pages with links (including the ?)
4) If branding rules prevent you from posting this sitemap
page on your company's site, then post it anywhere else --
even on free Web hosting space.
5) Submit this sitemap page to the major search engines. They
should eventually check every page linked to from that sitemap
page. They won't be able to follow the links any further (the
? stopping them), but they should be able to capture the HTML
title and, in some cases, the metatags -- so you could still
end up with some information about all of your pages in the
major search engine indexes.
6) Each HTML title and metatag should be no longer than 255 characters.
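Steps 1, 2, and 6 above can be sanity-checked mechanically. A hypothetical sketch (the data structure is illustrative):

```python
def check_tags(pages):
    """pages: list of (url, text) pairs, where text is a page's HTML
    title or metatag. Flags duplicates and anything over 255 characters."""
    problems, seen = [], set()
    for url, text in pages:
        if len(text) > 255:
            problems.append((url, "longer than 255 characters"))
        if text in seen:
            problems.append((url, "duplicate of another page's text"))
        seen.add(text)
    return problems

print(check_tags([
    ("a.html", "Widget specs and pricing"),
    ("b.html", "Widget specs and pricing"),  # duplicate title
    ("c.html", "x" * 300),                   # too long
]))
```

Running a check like this over every page before submitting the sitemap catches the two most common mistakes: cloned titles and over-length tags.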
But don't lose sight of the fact that if you are forced to proceed this way to help your company attract new prospects through search engines, your company has serious management problems. Keep kicking and screaming in hopes that eventually your message will be heard and the corporate rules will change. And meanwhile, be sure to update your resume and post it on the Web.
I took a quick look at the site, and immediately saw that it was over-designed -- that the page design was getting in the way of search engines ever seeing and indexing the content.
Content can drive traffic to a Web site
by way of search engines. But many well-established and
expensive sites are designed in such a way that search engines
cannot see the text on their pages. In such cases, simply
registering pages with search engines
accomplishes nothing.
At one point in the movie, Jennifer's son, age ten, has a question that she can't answer, and she replies matter-of-factly, "Google it."
They are walking up a New York City street. The mother, an underprivileged Latina who works as a maid in Manhattan, has lived within a four-block radius in the Bronx all her life. Presumably, she never went to college. The theme of the movie contrasts her lifestyle and that of the very privileged future US senator (son of a US senator) who falls in love with her. It's a modern Cinderella story, reminiscent of Flashdance, except in Flashdance the Cinderella had talent and had to prove she had talent to achieve her dream. In this case, Jennifer Lopez just has to be gorgeous.
But this stereotypically underprivileged person has seen the Web and Google, or used them or heard about them so often that she takes their capabilities for granted, as a normal part of her life and her son's life.
When she uses the expression "google it", no other explanation is required either for her son or for the movie-going audience. There's no mention of the Internet or search -- all of that is implied in the newly-coined verb "google". No big deal. To me the fact that that is not a big deal to characters of this kind in this kind of a movie is a very big deal indeed. To me that signifies that the Web has gone completely mainstream, that it is not just high-tech that we read about and hear about everywhere, but rather is an ordinary expected part of our daily lives, that we depend on it and take it for granted like refrigerators and stoves and microwaves and televisions.
The ten-year-old kid is shown to be bright when by chance he happens to be in an elevator with a New York state assemblyman who wants to run for the US Senate. The kid knows the candidate's voting record on environmental issues and makes some intelligent observations. The assemblyman's overzealous idiot assistant is shocked that the kid knows so much. But the assemblyman and the kid both take it for granted that all that info is readily accessible by all on the Web. Smart people know that and use that capability, regardless of their wealth, education, or background. Only fools don't.
To me this movie marks a stage in Internet history, somewhat like the cartoon in the New Yorker, back in July 1993, that showed two dogs looking at a computer monitor, and the one dog said to the other, "On the Internet, no one knows that you're a dog." That was then followed by the first appearances of URLs and email addresses on billboards and in radio and TV ads; and the first intelligent use of the Internet as a plot element in a high-tech popular movie with Sandra Bullock in "The Net," then the first use of the Internet as a plot device in a romantic comedy with Tom Hanks and Meg Ryan in "You've Got Mail." But in both those movies, and their imitators, the writers and directors felt it necessary to explain the technology. The people who used the technology were early adopters, a cut above the ordinary. You never see a computer in "Maid in Manhattan" (or at least I didn't notice one). In the other movies, computers were everywhere, and, in all probability, computer manufacturers paid the studios to have their equipment prominently displayed, as has been the case since the early days of PCs. Now it's a different ballgame. Computers and the Internet aren't just a part of the everyday office environment; the Internet and Google in particular have become an ordinary part of the English language -- not just how we do business, but how we think, how we deal with our children and with the complex world we live in.
Google, like Frigidaire, Scotch tape, Post-it, and Xerox, has become so pervasive, so well known, that the brand name is used as an ordinary word. The brand name has become so successful that the trademark is at risk. The other words reached that status in large part because of massive advertising campaigns. I've never seen an ad for Google.
The other brands became household words because those products were the first of their kind, or at least the first to be widely used. But Google wasn't the first Internet search engine -- far from it. Infoseek and Excite and Lycos were on the scene much earlier. Then they were pushed to the side by AltaVista, which dominated for a while with the advanced hardware and financial backing and the national and international advertising of Digital and then Compaq. But with that backing also came enormous corporate inertia and old economy thinking that held AltaVista back and dragged it in unnatural directions.
Google, which came on the scene late and started as a university research project with little funding, focused on search and just search, and continued to do so over the years, without being seduced into trying to become a general-purpose portal with fancy graphics and dozens of different applications. It grew by word of mouth, not by advertising; by providing an excellent, unbiased, all-inclusive, easy-to-use Web search service, not by making claims on television. And it is now so dominantly popular that not Lycos, not Excite, not AltaVista, all of which advertised heavily and loaded their home pages with flash and irrelevancies, but rather the late-comer Google, with its simple and direct approach, has become the household word, synonymous with Internet search, almost synonymous with the Web itself.
And their tradition continues. Shortly before the Christmas shopping season, Google launched a new Web site called "Froogle", which focuses on product search for shoppers. Instead of adding this service to the Google site and cluttering it, taking it away from its core strength, they made a separate site. They didn't add an ad or even a link from the Google site -- keeping that site as clean and simple as it was before. Over the last few weeks, I've seen no ads for the Froogle site either -- whether in print or on the Web. Rather, they relied on providing a quality free service and backing it with good public relations. As a result, over the last few weeks, I've seen dozens of mentions of Froogle in on-line and print magazines, and have received email from a dozen friends suggesting that I give it a try. Word of mouth based on quality, not advertising, wins in the new business environment.

Napster created a marketplace, trained tens of millions of people in the use of P2P applications, and got them addicted to seeking new music online. The death of Napster cleared the way for new companies to fill that role, companies with lots of incentive to come up with alternative ways to do the same thing without central control, without any way for anyone to track what was going on, beyond the legal reach of the music industry. These Napster wannabes have the additional advantage that today the typical new computer system has far more disk space and that many more people have high-speed Internet access through cable and DSL.
One of the most successful of the Napster wannabes is Kazaa.
To join that community, you go to http://www.kazaa.com and download their software. On installation, that software sets up a "shared" directory on your PC. You can move whatever files you want to that directory, and any music that you download from Kazaa goes, by default, to that directory. When you do a search for files, your search is propagated through computers running the Kazaa software, and you get back a list of matches that appear in those shared directories. By default, your computer also becomes a Kazaa "supernode," so when new users sign up, some of them, at random, will fetch the Kazaa software from your computer. Also, by default, anyone can copy anything from your shared directory at any time, without warning, and without leaving a trace that they were ever there.
The Kazaa software includes a "theater" which makes it easy to sample the files you are interested in before downloading them, and also to play the ones that you already have.
The fact that high-speed cable and DSL access keeps you continuously connected to the Internet, and that many people who are connected that way leave their computers on much of the time if not always, makes this service far more efficient and effective than something similar would have been a few years ago, with intermittent, slow dial-up access.
And the combination of high speed access and gigabytes of unused disk space means that it is easy to fetch not just music, but also far larger video files. The typical files my son Tim gets are well over 100 Megabytes, and consist of combinations of music and video adapted and cleverly edited from his favorite TV shows, such as Gundam Wing and Dragonball Z. And some of the files available consist of complete episodes of those animated TV shows.
Sounds great, feels great. And this is all "free". But beware. On the Internet, you often pay a price for free services, a price that isn't measured in money paid.
If you want to venture into this untamed realm, you should take a very close look at the default settings. First, keep in mind that while Kazaa can provide some level of virus filtering, the default setting turns that off. Be sure to turn it on; and also make sure that you have good up-to-date anti-virus software running on your PC.
Also, keep in mind that you do not have to make your PC a supernode and stock your shared directory with interesting files, and allow other people to upload from your shared directory. That is not a condition for participating in the community. In fact, if you wish, you can shut your computer off from access by other Kazaa members, either all the time, or when you are doing serious work on your computer, and don't want your system resources to be randomly taken over by strangers as they upload your files.
Also, while with Napster you could select a dozen songs and let Napster fetch them as quickly as possible while you did other things, at Kazaa, even with high-speed access, it often takes quite a while to get the tunes and videos that you want, one at a time; and sometimes, though rarely, even after repeated tries, you still don't get what you want.
Also, whenever you use Kazaa, your eyeballs will be bombarded with annoying animated popup ads, often cleverly disguised to fool you into clicking on them.
And use of Kazaa drains your computer's short-term memory; no matter how much RAM you have (and we have 256 Megabytes), sooner or later you won't be able to see animations or play music, and if Kazaa is running in the background, the other applications you are running will slow down or even freeze, and you'll have to reboot.
But even with these drawbacks, the challenge of the hunt is often as satisfying as enjoying the music itself. I suspect that even people who have enough money so it means nothing to them to buy music would prefer to hunt it down this way, for that extra jolt of excitement that comes from playing this wild file-fetching game.

Back in 1995, in the early days of the Web, I delighted in the ability to put an entire book in a single document. With plain-vanilla static HTML and no graphics, very large, textually rich pages loaded quickly. And with internal links it was easy to have a detailed table of contents at the top of a large "page" from which you could click to any chapter, and footnote numbers in the text linked to the list of footnotes at the end, and links from there back to where you were before in the text. But then along came search engines.
Thanks to the search engines, people could now find your pages even though you did nothing to advertise/market them. That was a very valuable and important free service. So we began to pay close attention to how they worked and how they ranked the pages they indexed, because changes in the ways we designed our pages could make a big difference in traffic.
When I was researching my book The AltaVista Search Engine, back in 1996, when I was at Digital, I was surprised and annoyed to discover that AltaVista only indexed the first 100K of the Web pages it found. It would pick up links from the rest of the text, but only the beginning of a large page was fully indexed. On the one hand, AltaVista gave extra value to large, text-heavy pages, as opposed to short ones that consisted of just a few sentences and graphics and links. But on the other hand, they set this arbitrary limit. Knowing that, I went back and broke up my biggest, most useful pages into a series of shorter ones -- watching that 100K limit. So instead of presenting books as single documents, I broke them into chapters. The result was less useful to my visitors, but the difference in terms of search-engine-generated traffic was important. Later, without making this information public, AltaVista changed that limit to 67K, so people, like myself, who had redesigned pages to comply with their arbitrary limit were unintentionally penalized.
On the plus side -- at least from my perspective -- the fact that search engines indexed every word on every page meant that you could create Web pages designed to be found by particular people or particular sets of people -- as a research tool or a marketing ploy. All you had to do was include the people's names (if they were unique) and the names of organizations and activities that you knew they were interested in and likely to search for. I started writing articles about that technique, which I call "flypaper", back in 1996; and I recently wrote another article telling about the amazing results that flypaper brought in my research into the Sergei Solovieff mystery: a variant of the Spanish Prisoner scam.
Also, back in the early days of the Web, "anchors" were insignificant. The anchor is the set of words that is highlighted in an HTML page to indicate that it is associated with a hyperlink. Click on the anchor words, and you go to another page. Many sites would use something innocuous and meaningless like "click here" for all their links. I preferred to write out the complete URL and have that as the anchor, so visitors could know where they were going when they clicked, and also so they might remember the address itself for future reference (and for that reason, I kept my URLs as short and simple and logical as possible). Then along came Google, which treated anchors in an unusual way. Like other search engines, Google filled its index by sending out crawlers which followed trails of links. But Google also paid special attention to anchor text, "remembered" the anchor text associated with particular links, and used that text in its ranking algorithm. They even added to their index pages that they had not yet crawled, but knew of only indirectly through the anchors that linked to them. So if hundreds of different Web sites all had anchors that read "tyrannosaurus rex" and all those anchors linked to the same page, that page would likely come up very high on a search for "tyrannosaurus rex" regardless of what was on that page or whether Google had ever visited it. Over time, Google covered more and more pages directly and added an algorithm for estimating the relevance of a page to the text in the anchors pointing to it. But they still include many pages in their index that they know of only indirectly through anchors. And because of this practice of theirs, my practice of using the URL itself as anchor text means that my links to useful and important pages at other sites probably give those sites less benefit than if I used catchy phrases as anchors instead.
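To make the distinction concrete, here is a hypothetical sketch of the two anchor styles (the URL and page are invented for illustration):

```html
<!-- The URL itself as anchor text: visitors see where the link leads,
     but the anchor tells Google little about the target page -->
<a href="http://www.example.com/dino.html">http://www.example.com/dino.html</a>

<!-- Descriptive anchor text: Google associates the phrase
     "tyrannosaurus rex" with the target page and uses it in ranking -->
<a href="http://www.example.com/dino.html">tyrannosaurus rex</a>
```

Both links take the visitor to the same place; the difference lies entirely in what the anchor words tell the search engine about the destination.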
Worst of all, in terms of distortion of Web site design and encouragement of non-productive practices and business models, has been the concept of "popularity." Google and other search engines decided to define "popularity" in terms of links to a page from pages at other sites and decided to give that "popularity" lots of weight for ranking in search engine results. Before, links to your pages from other sites were helpful because visitors might click on them, giving you extra traffic. Now such links were even more important -- not for the clicks, but rather for the unwarranted interpretation that the search engines gave to them. Pages with lots of links to them got high ranking on searches, and hence got lots of traffic -- even if no one ever clicked on those links.
This mechanism gave a boost to businesses that helped mediate link exchanges among sites that had never heard of one another and might have nothing at all to do with one another. It also led to the creation of many Web pages that consisted of nothing but links -- useless links to other sites with content that had nothing in common with the linking site. Eventually, search engines like Google caught on to this practice (which they had unwittingly encouraged) and figured out how to estimate the relevance of links. Hence, those links no longer provide the traffic boost they used to; but those useless pages and link exchange programs linger on.
The value of link-based "popularity" also meant that if you were going to create a set of sites -- either by yourself or by recruiting others to run them for you -- you would be better off buying a separate domain name for each, rather than running each in a separate directory of the same site or a separate sub-domain of the same domain. If you used separate domain names, Google, and other search engines using a similar algorithm, would interpret the numerous links from one of your sites to another as links among independent sites, and hence would give you a big boost in the rankings for "popularity". For instance, Webseed built a business with a couple thousand volunteer-run Web sites, each with its own domain name. The links among these sites made them very "popular" by Google's algorithm, which led to substantial traffic, and (in the days when banner advertising was viable) helped generate revenue. From Webseed's final messages to its volunteer Webmasters (of which I was one), I gather that, eventually, Google caught on and figured out how to discount incestuous linking (at least in the case of Webseed); hence, Webseed's traffic dropped precipitously, and their business model collapsed.
So why should we care? Millions of Web sites and Web-based businesses are dependent on one another. What one business does in pursuing its own best interests can affect other businesses in unintended ways. For instance, a company sending out a spam email with a subject line intended to fool people into opening it immediately trains the recipients to doubt any future message sent with a similar subject line. The more spam messages sent, the more words and phrases become "tainted", limiting more and more the vocabulary available for legitimate communication. We're all drinking from the same waterhole, and when one person pollutes it, we all suffer.
The people who make the rules and formulate the algorithms for the major search engines should take into account that their decisions affect more than the internal working of their systems and more than the satisfaction of their visitors. Those decisions can make an enormous difference in the traffic to the sites that they index (and that they don't index), bringing some companies sudden success and destroying others. Those decisions can also lead to strange, unintended distortions in Web page design, as companies, in their struggle to survive, do their best to understand the underlying mechanisms of search engines and make changes intended to boost their search-engine-generated traffic.
But while standards get publicly aired and debated by bodies with representatives of the interested parties, the details of search engine design and their ranking algorithms remain shrouded, as proprietary trade secrets; and the designers can make changes whenever they please, without telling anyone beforehand or even afterhand. And those secret decisions can have enormous repercussions throughout the Web.
We have here a case where private business interests can collide with the good of the overall community, a case where the normal rules governing "trade secrets" might in the future be modified. That could happen by the search engines themselves recognizing their responsibility and sharing such information in ways they never have before, and seeking input and feedback from affected companies and individuals. They could do that publicly and individually as a way to enhance their image, or privately through participation, say, in the Worldwide Web Consortium, where design changes might be openly discussed, without full disclosure to the public -- giving an opportunity for experts to probe and seek to understand the business and technical implications and the possible unintended consequences, without giving away crucial proprietary information.

With a few quick queries I soon established that they weren't looking for me at all. They were looking for themselves. They had gone to search engines and had entered their own name as the query. And since I have a lot of content at my Web site -- including lots of my writing -- many of my old friends are mentioned there. Searching for themselves, they chanced on me; and wound up sending me email.
If I had wanted to find them, I could have spent a lot of time looking and might never have succeeded. But because I had my own Web pages and, by chance, those pages had the right kind of content, and that content was indexed by search engines, the old friends found me instead.
The developers of AltaVista and other search engines intended to allow people to find answers to questions and to locate specific information that they need. But instead, it turns out that many people look first for themselves -- satisfying their curiosity about how often they, and others with the same name, are mentioned and what's said about them. Next they look for particular things that are near and dear to them -- often just out of curiosity, rather than need. It was this behavior and the fact that I had my own personal Web pages that led to me getting so many email messages from old friends -- them finding me by looking for themselves.
I soon realized that what I had done by accident, others could do deliberately -- setting out "flypaper" rather than going hunting with a "fly swatter." While hyperlinks are a way to point people away from your Web pages to other resources on the Internet, "flypaper" provides a way to draw people to your pages and encourage them to get in touch with you directly.
It's a neat flip of your usual expectations -- you connect with the people you want to by making their names and their subjects of interest findable at your site.
You can create Web pages and organize the content on those pages specifically for the purpose of drawing particular people and particular kinds of people to your Web site and hence getting in touch with them.
So how could a business use the flypaper approach? If you want to connect with a particular person, and your phone calls and email are going unanswered, create a Web page that mentions that person and topics that that individual is interested in. Say, on that page, all the good things you've been meaning to say about how you could both benefit from working together.
Be sure to put the person's name and the company's name in the HTML title and in the first line of text, so the ranking algorithms at AltaVista and other search engines will put your page high in the list of matches when people search for those words and phrases.
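For instance, a minimal flypaper page might look like this (the person and company named here are invented for illustration):

```html
<html>
<head>
<!-- The person's name and company name go in the HTML title,
     where search engines give words the most weight -->
<title>Jane Doe of Acme Widgets -- why we should work together</title>
</head>
<body>
<!-- The same names again in the first line of visible text -->
<p>Jane Doe of Acme Widgets has long been a leader in widget design.</p>
<p>Here are the good things I have been meaning to say about how we
could both benefit from working together ...</p>
</body>
</html>
```

A plain static page like this one also has the advantage that every word of it is visible to search engine crawlers.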
You needn't have hyperlinks from anywhere to your "flypaper" pages. Just be sure to submit the individual URLs to search engines, especially to AltaVista and InfoSeek, which are good about quickly adding material to their indexes. AltaVista will usually add your pages overnight.
The next time the target person does a search for him or herself at AltaVista, your page is likely to appear at the top of the list. When that happens, that person may get in touch with you, and suddenly your position in the upcoming dialogue is greatly improved because they contacted you instead of you contacting them. There are no guarantees, but it's certainly worth a try; and the odds are getting better all the time as more people use the Internet regularly.
That's what I call targeted flypaper -- where you are trying to get in touch with one particular individual.
You also could try general flypaper.
For example, at my Web site I have a list of every book I've read for the last 41 years. It's just a list. When I posted it, I doubted that anyone would be interested. I posted it as a lark, for the fun of it. But because of search engines like AltaVista that Web page draws lots of traffic to my site. I've gotten email from authors, agents, editors and others who like the same books.
How can you apply this concept? Say you work for a school. Create a Web page that lists every alum, the year of graduation, and other public info about them. Submit the URL to the search engines, and you'll get email from some of them. As you begin to draw an audience to your site with flypaper of this kind, you might give these people reasons for coming back, becoming a loyal audience -- part of a new on-line community.
However you decide to use flypaper, be open to the unanticipated value of saving, recording, and posting information of all kinds.
Then it gradually dawned on me -- why should they look for me? Just like me, they probably each have a hundred or more people who they once were close to (old roommates, business associates, etc.) who they've lost touch with. And why, out of all those others, should they actively come looking for me?
With a few quick queries I soon established that they weren't looking for me at all. They were looking for themselves. Yes, they had gone to search engines (most to AltaVista), and there they had done what most people do at those sites -- they had entered their own name as the query. And since I have a lot of content at my Web site -- including lots of my writing -- many of my old friends are mentioned somewhere there, typically in the list of thank-you's at the end of a book. Searching for themselves, they chanced upon me; and delighted at that unexpected occurrence, they sent me email.
So when the number of long-lost friends finding me starts to slow down, I should create a new page at my Web site where I mention folks I haven't mentioned elsewhere. In other words, instead of trying to systematically find other people on the Internet, I'll set things up to make it easy for those people to find themselves in documents at my Web site -- what I call the "fly paper approach."
It's a neat flip of your usual expectations -- you connect with the people you want to by making their names and their subjects of interest findable at your site. And the same approach could also work well in the world of business, when you are trying to connect with potential customers or potential employers.
Unintentionally, I've seen several business-related instances of this phenomenon over the last few months.
Ebooks Multimedia in San Francisco, maker of interactive CD-ROMs for children, was looking for content that they could turn into product in time for this Christmas. Using search engines, they found my book The Lizard of Oz at my Web site. This is a book that I self-published 22 years ago, and which had simply been gathering dust. Within about a week of their first contacting me, we had a signed contract, and they are now at work on the project.
Soon thereafter, a movie producer in Iceland looking for new material found my never-produced screenplay Spit and Polish. That's not likely to lead anywhere, but it's an opportunity that I would never have dreamed of pursuing actively myself.
In both those cases, instead of my having to identify prospects, write query letters, and submit manuscripts -- which takes time, effort, and money -- they found me. And because they made the first contact, the conversation started at a different level -- they had a particular need, and they had already determined that my work might fill it.
The most dramatic instance of this principle was totally unexpected -- a kind of opportunity that I would never have dreamt of.
A Gary Trudeau fan, looking for a copy of Bull Tales (Trudeau's first book, published back when he was an undergraduate at Yale), found it mentioned at my Web site in a list I have there of every book I've read over the last 38 years. He sent me email to find out if I still had a copy. He also noticed at my site that my daughter (now a sophomore at Sarah Lawrence) is into acting. It turns out that he is the writer/producer of several popular TV shows, and she was in LA over the summer acting in a movie written and produced by my sister Sallie. After a few friendly email messages, I wound up trading my copy of the book for my daughter to get an audition for a possible part in an episode of one of those TV shows. Nothing immediate is likely to result, but she learned a lot from the auditioning experience and made contacts that could prove important in the future.
Another close relative has been looking for a new job. She did all the traditional things -- checking newspapers and job-related Web sites. She's had some interviews, but none of the opportunities looked particularly tempting. Then, out of the blue, she got a call from a headhunter who had found her resume on our Web site. A company located far closer to her home than her present job was looking for someone with her credentials. She's gone for a couple of interviews and it looks very promising and tempting. Regardless of how it turns out, this is another instance of how effective it can be to make yourself findable on the Internet.
So how could a business use the flypaper approach? Instead of (or in addition to) pounding on the doors of prospective customers or partners, set up pages that mention them and the topics that are of most interest to them. If possible, mention the names of the key individuals. You needn't link from your home page to those flypaper pages at all, but do link from the flypaper back to either your home page or other pages you'd particularly want those folks to see.
Then the odds are good (and getting better all the time as more and more companies and people come onto the Internet) that some of them will find you while looking for themselves or looking for the topics of most interest to them, possibly leading to the kind of business contact you want.
And, of course the same approach could be effective in trying to recruit particular individuals to come and work for you, as well as trying to get particular employers to come looking for you.
So get to work -- set out your flypaper. It needn't take much time or effort, but the payoff could be significant.

He's been paying a high price to have his Web site designed and maintained by a firm in England, and has not been getting much traffic. Typically, he asks me for advice, then forwards my answers to the design firm, which then replies, and I react to the reactions.
This three-way dialogue (paraphrased below) illustrates common misconceptions not only of Internet newcomers, but also of the design firms that serve them.
FREELANCE WRITER: My site isn't quite finished yet. I still have to add some of the articles that I have written, and I am waiting for the design firm to register my site with the relevant search engines.
ME: Search-engine registration is something you should do yourself -- it is very simple (it's silly to pay someone to do it). And it's something you should keep on top of all the time (e.g., submitting each and every one of your pages separately to AltaVista, and resubmitting new and altered pages whenever you make changes).
Since your content is primarily text, you should design your pages yourself, as well.
Your current pages are pretty, but are virtually useless in terms of search engines -- so complex as to be search engine unfriendly.
You also should add much, much more content. Search engines only see text -- the more the merrier (I have over 1200 documents at my site, some of which are entire books).
FREELANCE WRITER: My long-standing, but still to be realized, aim in life is to direct people towards my site -- and to pick up work accordingly.
ME: That's basically what I do. In October, I had over 100,000 page views. I'm averaging 1200-1500 unique visitors a day -- all because of my content. (No advertising).
In addition, there are the benefits of building a reputation, establishing and cementing contacts that could prove valuable later, etc.
About once every two weeks, I'm contacted by one reporter/writer or another who wants to interview me as part of an article because they found related material at my site. (Yesterday, one from a newspaper in Modesto, California, about online shopping.)
Also, not all the "wins" are represented in cash. E.g., I translated, from the Russian, two books by a Russian officer (Alexander Bulatovich) about his experiences in Ethiopia around 1900. I was unable to sell the translation to book publishers. (The publishers all said that there was no market for anything about Ethiopia, regardless of its merit.) I then posted the full text of both books at my Web site. I got interesting email from people all over the world (including a grad student in Poland, who then based her doctoral dissertation on my texts). Then I got a snailmail letter from an elderly professor in Addis Ababa (Professor Pankhurst), who is known as the number one guru on that period of Ethiopian history. Someone had found my translation on the Web, printed the whole thing out, and given it to him. He said that this really "must" be published. Shortly thereafter I got email from a professor in Bremen, Germany, who happened to be the great-grandson of the Emperor Menelik II (mentioned in the books), and who was adamant that they must be published, and even offered to help the would-be publisher financially. With those two messages as ammunition, I went back to an editor who had previously rejected the manuscript, and he almost immediately accepted it. It was finally published last summer. As this is an "academic" book, I'm paid in copies and reputation.
Also, a play that I wrote just after college, back in 1970, and that has never been performed (Without a Myth), I posted at my Web site. I was contacted by an independent theater group in Spokane, Washington, who wants to put it on. They've scheduled it for this December, in the main public theater in that city.
See my article about "flypaper" for more such anecdotes.
FREELANCE WRITER: This seems to be beyond my own skills -- which was why I asked this firm to create a site for me. As far as I understand matters, there is nothing worse than simply having it out there, not properly registered at search engines and not working for you. Otherwise it's kind of like a brochure or pamphlet, however beautifully produced, sitting in your drawer, with no one ever seeing it.
ME: Yes, but you should and can do all that for yourself -- and you'd do a far better job of it.
DESIGN FIRM: Your site is search-engine friendly (as indeed is the database).
ME: Yes, your design firm has set up a very slick and attractive site for you, with .asp pages assembled from elements in databases, based on javascript. But the only text on these pages that search engines can "see" is the brief metatags. Everything else is buried in the database. This approach is totally search-engine unfriendly.
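The difference is easy to see in a simplified, hypothetical sketch of what a crawler actually receives in each case (file names and text invented for illustration):

```html
<!-- Script-assembled page: the crawler reads only the brief metatag text;
     the real content comes from a database via javascript the crawler never runs -->
<html>
<head>
<meta name="description" content="Freelance writer and journalist">
<script src="buildpage.js"></script>
</head>
<body></body>
</html>

<!-- Plain static page: every word is visible to the crawler and gets indexed -->
<html>
<head><title>Articles by a freelance writer and journalist</title></head>
<body>
<p>The full text of each article appears here, indexable word by word.</p>
</body>
</html>
```

Both pages may look identical in a browser; to a search engine, the first is nearly empty and the second is rich in indexable text.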
DESIGN FIRM: Your site does not necessarily need to "contain much much more content". Search engines work on a density method. The number of times words appear on a page determines the relevance and how it scores on an engine. Too high and the page will not be listed, too low and it may not be found.
ME: That is absolutely untrue. Search engines, like AltaVista, assign little or no value to repetition of words. Plus, keep in mind that only the text in the metatags of your pages is visible -- very, very little text. AltaVista, in particular, gives more weight to pages that are rich in useful text.
DESIGN FIRM: Just adding content does not increase the relevance.
ME: I'm not talking about single-word or single-phrase relevance. Search engines only "see" text. The more text you have, the more likely that people searching for the kinds of content you have will see one of your pages in the list of matches. Most people type phrases and series of words -- not just a single word.
Factors affecting relevance/ranking include: do the query words appear in the HTML title or in the first couple of lines of text? How large is the page? (The larger, the better.) Etc.
DESIGN FIRM: I have prepared pages before that have had less than 50 words and scored 99.5% on Infoseek!
ME: That's useless information. It doesn't matter what kind of score you get at one particular search engine for a particular single-word or single-phrase query. What matters is how much traffic you get.
DESIGN FIRM: Were you to publish every article in full, you risk people poaching it and claiming it as their own. This does happen, so is one reason why I suggested going down the route of giving short descriptions in the database and then giving the article in full on payment.
ME: If you are paranoid, you can do periodic searches of the Web for sentences from particular texts -- if someone steals your property, you can sue. But don't be naive. Anyone can take printed text and scan it or type it in and post that. (Material on the Web is no more vulnerable to stealing than any other published material.) And, of course, anyone anytime anywhere can simply paraphrase what you have written and thereby legally appropriate it. Those risks have been around a long time. What the Web does is open opportunities for you to reach a wider audience, to become better known, and especially to become better known by people who might want to hire you to write new material.
DESIGN FIRM: True Internet marketing is a long and expensive process. The page I mentioned earlier took 9 weeks to score with that relevance and about 10-20 hours of development (one page, one search keyword). So, to do the job really thoroughly costs a lot of money.
ME: That is wasted time and effort, and money thrown away.
DESIGN FIRM: You can go to firms that do it on the cheap through economies of scale, but even then you would look to pay £500 - £1000 just for one or two search words. Essentially, the whole process is more than just search engine relevance, though. It really depends on your target market. In your case, this is editors, who I doubt go to search engines very often looking for writers. More so, writers will come to them. If you can contact an editor with a link to your professionally finished site, you stand a much higher chance than with a wordy or poorly designed site. The principle is the same as a CV - short, concise, factual, and relevant.
ME: Even the advice about the CV (resume) is wrong. On the Web, long resumes are much better, get many more search engine hits and hence get you much more visibility (if they are properly constructed.)
FREELANCE WRITER: I just went to Google and looked for "Freelance writer and journalist" and my site came up quite high on the list. This is what I was after, in the first instance.
ME: The likelihood that someone would search for that exact phrase "freelance writer and journalist" is very slim. The best metric is how much traffic you get and how much business you get from it. The best leads are the ones that come from people doing unexpected searches that match your content, and you get more of those if you have lots of content available at your site.
DESIGN FIRM: I have designed your site to be search engine friendly, with a keyword density to focus relevance as much as possible within the budget you had available, and have submitted it to a large number of search engines, particularly the big eight which most people use (Yahoo, AltaVista, Infoseek, Google, etc.). Unfortunately, this process does take time so the results may not be apparent very quickly. I am continuing to monitor them, though.
ME: He's playing on your ignorance of the Internet and bleeding you.
DESIGN FIRM: As I mentioned to you before though, I think the majority of your business is still going to come from sending short marketing e-mails to editors with a link back to your site.
ME: As I detailed for you, I get significant business from editors finding my articles at my site -- people and publications that I had never heard of.
DESIGN FIRM: Lastly, there is more than one way to skin a cat, as they say. So his approach is one approach. However, there are others. The way search engines score changes almost monthly and it is a dynamic process, so paying money to score highly one month does not mean you will still be there the next month. I subscribe to a group of researchers who look into this and how search engines change every month, then give recommendations on page design, marketing techniques, etc. It really is an industry that doesn't stand still by any means!
ME: Sure there's more than one way to approach a problem, but his way costs a lot and doesn't generate traffic. You're much better off doing this stuff yourself. (His score-keeping method is irrelevant to your needs.)
I presume that you want to leave your existing pages in place and that what we will be doing will be adding new content or mirrors of some of your current content in more search-engine-friendly form. That would be my preference. These new pages would be simple and unadorned.
FREELANCE WRITER: So what does a cyber-goon such as myself make of all of this? All I know is that I now have a web site and I want it to work for me - as yours does for you. That is why I thought that it would be a good idea to be personally tutored by you - so that I could learn the ropes first hand.
I also take your point about the knock-on, or secondary advantages, of the web site. As Mrs. Thatcher once memorably remarked, it's a funny old world. (That was just after she got thrown out as leader of the Conservative Party, having won three consecutive general elections...) You never know where a web page can lead you. Or what opportunities can crop up. As you explained very clearly in your last note.
Maybe you and my web design man are in fact saying the same thing in different ways. I really don't know.
ME: No. We are diametrically opposed. The design techniques he is using produce pages that are "pretty", but do not generate traffic. Search engines love text.
To get started, first, you need to be able to connect to your site by ftp (file transfer protocol). (If you don't have it already, there's free software called ws_ftp that you can pick up from just about any large shareware site, like www.download.com and www.tucows.com.) The automated database-driven system that your design firm has set up makes it impossible for you to do the kinds of things I recommend. You want a Web hosting service that allows you to simply upload your pages -- without a preset structure. What the design firm has set up for you takes control out of your hands and makes you dependent on them for running the site.
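Uploading a page by ftp is the same operation whether you use a graphical client like ws_ftp or a script. A minimal sketch of the scripted version, using Python's standard ftplib (the host name, credentials, and file names here are placeholder assumptions -- your hosting service supplies the real ones):

```python
from ftplib import FTP

def upload_page(host, user, password, local_path, remote_name):
    """Upload one HTML file to the web root of an FTP account."""
    with FTP(host) as ftp:
        ftp.login(user, password)          # authenticate with the host
        with open(local_path, "rb") as f:  # read the page as raw bytes
            ftp.storbinary(f"STOR {remote_name}", f)  # send it up

# Example call (placeholder values -- do not run as-is):
# upload_page("ftp.example.com", "writer", "secret",
#             "articles/clips.html", "clips.html")
```

The point is the shape of the workflow: a plain file on your disk goes straight to a plain URL on your site, with no database or content-management layer in between.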
FREELANCE WRITER: I more or less told the chap what I wanted from the outset -- colors etc. -- and he put it together.
ME: I'm sure he gave you exactly what you asked for. The problems are: 1) you didn't know what to ask for, and 2) the techniques he used to produce the pages keep search engines from seeing your content.
If it were possible to upload new pages to your site using ftp, and if it were possible for you to directly edit the pages that are already there, then we could do good things.
In that case, I would recommend: