Discussing Web Design & Marketing Since 1998


Admin - Top Level
iamlost last won the day on March 6

iamlost had the most liked content!

Community Reputation

1,181 Excellent

About iamlost

  • Rank
    The Wind Master

Recent Profile Visitors

152,441 profile views
  1. Extremely. It's amazing how few folks pay attention and optimise for conversion, particularly multi-page click tracks. Not that I'm about to complain that 90%-plus of competitors think optimizing for Google is enough, or that another 9%-plus think optimizing each page separately is conversion nirvana. The longer the click track that leads to conversion, the more opportunities to fine-tune and maximise ROI, and to offer options that pick up dropouts along the way.
  2. First off, that is an extremely difficult niche to enter. The enterprise-level competition is fierce and dominates, and Google is increasingly entering that space directly, pushing even the big OTAs down the SERPs. Secondly, these days new sites tend to take quite a while to gain a foothold in SEs, often up to a year and a half. Lastly, simply putting up a site and waiting for links and rankings is a decade out of date. One needs to look beyond mere search, especially in competitive niches, and engage with and market to audiences wherever they might be, such as on the various social media platforms. Note: or buy advertising to drive traffic and interest. Not that I ever have, but in some situations it is almost required.
  3. Over time some very dominant content-to-content click tracks were developed, and a good many were largely ignored and dropped. Many are four to six links (and pages) deep/broad; some are a half dozen to a dozen. After a while one gets a feel for the various information flows that may appeal to visitors; however, some don't make all that much linear sense. Sometimes it seems to just be a matter of shiny! and people take a tangent. A browsing deflection.
  4. I know several parks where they waited to see where the paths developed and then incorporated them into the design ... rarely a right angle, and often long stretches wandered a bit rather than being straight. Organic walkways. I've always tried to offer as many natural browsing, reading, and researching in-content tracks as I can to balance normal hierarchical navigation. Over time it's built up into a series of interesting click tracks while increasing page views and time on site. Unfortunately it's a matter of trial and error rather than a beaten path...
  5. There is a general misunderstanding of how algorithms such as Google's work. Put very simply, an algo change is one of the following:

    1. A change in the threshold of one or more inputs. This is the most common; when G says they change things daily, this is probably what they mean. To use the historic PR example: each link has a damping factor of 0.15, but today it gets changed to 0.125 or 0.2. Note: it is probable that various types of links from/to certain niches/verticals have multiple inputs and thresholds. Not that links are the alpha and omega of G's algo, just the most marketed.

    2. A change, i.e. an addition, a deletion, or a modification, of an input. This is much rarer, and when it occurs it is probably far more likely to be the addition of a new input. Note: otherwise we'd still be at ~200 different inputs rather than ~10,000+.

    Of course, with Google the algorithm is not actually one single such. Rather, it is a core algorithm with additional precedent and subsequent algorithms, depending on additional factors such as those specific to a particular niche/vertical, feeding in or otherwise modifying data.

    The last G update that Marie references has been said by JohnMu to be a core update to help missed deserving sites. To me this means that G noticed they had some thresholds in the main part of the algo set too high/low, causing too many false negatives (you aren't 'quality' when you are), and made adjustments. Being core algo changes, it caused a broad shift in query results, aka was widely noticeable across niches.

    To quote glyn: I ignore all this stuff so much. It is simply SE business as usual. However, SEOs need to market their Google expertise and SEO software developers need to sell some rationale for their tools, so webdevs get dramatic 24/7 algo updates while the very climate changes largely unmentioned.
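The threshold-tweak idea in point 1 can be sketched with the classic PageRank formula. A minimal sketch, assuming a hypothetical three-page link graph and illustrative damping values only (not any real Google setting):

```python
# Minimal PageRank power iteration, to illustrate how nudging a single
# algorithm "threshold" (here the damping factor) shifts every score.
# The three-page link graph below is invented purely for illustration.

def pagerank(links, damping=0.85, iters=50):
    """links: {page: [pages it links to]} -> {page: score}."""
    pages = list(links)
    n = len(pages)
    ranks = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1.0 - damping) / n for p in pages}
        for p, outs in links.items():
            share = ranks[p] / len(outs) if outs else 0.0
            for q in outs:
                new[q] += damping * share
        ranks = new
    return ranks

graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}

# Same graph, two damping settings: the ordering may stay the same
# while every individual score (and any threshold test on it) moves.
for d in (0.85, 0.80):
    scores = pagerank(graph, damping=d)
    print(d, {p: round(s, 3) for p, s in sorted(scores.items())})
```

The point of the two runs: nothing about the sites changed, yet a single internal constant moved every score, which is what a daily "threshold" tweak looks like from the outside.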
  6. It is quite fascinating, the number of 'dirty' little secrets that exist(ed) in web advertising. Perhaps the greatest, longest running con was the 'impressions' gambit; however, the subsequent '1/2 viewable for 1 second' replacement still means that one is paying for less than 5% of what is actually being 'seen', 'viewable' and 'seen' not being synonymous.

    Now comes Ebiquity's Re-Evaluating Media report, done for Radiocentre. It evaluates the UK's 10 major media channels: cinema, direct mail, magazines, newspapers, online display, online video, out-of-home, radio, social media, and TV, by comparing the perceptions of advertisers and ad agencies of the various advertising media with the reality (based on third party data on targeting, ROI, emotional response, and brand salience) of what each of these channels actually brings.

    First, how the advertisers and agencies ranked the channels: 1. TV, 2. online video, 3. social media, 4. out of home, 5. cinema, 6. radio, 7. newspapers, 8. direct mail, 9. online display, 10. magazines.

    Second, what 'reality' has to say: 1. TV, 2. radio, 3. newspapers, 4. magazines, 5. out of home, 6. direct mail, 7. social media, 8. cinema, 9. online video, 10. online display.

    So, if the beliefs and realities are so very different, why do the 'experts' get things so 'wrong'? Perhaps the fact that traditional media commissions average ~3% while digital media commissions average ~9% has a bearing?
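The gap between belief and reality can be put into one number. A quick sketch computing the Spearman rank correlation between the two orderings quoted from the report (channel names as given above; the formula is the standard no-ties Spearman rho):

```python
# Spearman rank correlation between the advertisers' perceived ranking
# and the report's "reality" ranking of the ten channels quoted above.

perceived = ["TV", "online video", "social media", "out of home", "cinema",
             "radio", "newspapers", "direct mail", "online display", "magazines"]
reality = ["TV", "radio", "newspapers", "magazines", "out of home",
           "direct mail", "social media", "cinema", "online video", "online display"]

def spearman(order_a, order_b):
    """Spearman rho from two rankings of the same items (no ties)."""
    rank_b = {item: i for i, item in enumerate(order_b)}
    n = len(order_a)
    d2 = sum((i - rank_b[item]) ** 2 for i, item in enumerate(order_a))
    return 1 - 6 * d2 / (n * (n * n - 1))

print(round(spearman(perceived, reality), 3))  # 0.103
```

A rho of roughly 0.1 is close to zero: the experts' beliefs barely track the measured reality at all.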
  7. I've always thought Rand an intelligent, interesting person, someone I'd love to chat with over dinner; that said, I've always treated his SEO advice (and that of SEOmoz/Moz) as SEO platitudes: initially informing folks of last year's practice and this year's gossip, learning that (SEO) consulting doesn't scale, latterly selling tools that do all sorts of fascinating stuff that is increasingly 5-10 years past its best-before date, if it ever wasn't. Given his prospective new business I do wonder how much of his 60% upset on leaving is Moz's disinterest in going beyond Google and SEO at a time when increasing segments of webdev are looking elsewhere, aka diversifying beyond the almighty G. Methinks Moz is in danger of becoming a buggy whip company. None of which takes away from his accomplishment in building both his personal brand and his company; both quite extraordinary feats of dedication, perseverance, and hard work. Whenever I question his 'reality' (or his current hipster image) I remember that Geraldine DeRuiter thinks he's someone special. Whatever else, Rand is truly fortunate in the women in his life.
  8. Google a Flutter

    Sorry, I'm (1) not about to learn yet another language (Dart) and (2) not about to enter G's parlour no matter how enticing the newest shiny offering. Plus, given G's track record, anyone wanna bet on the service level? Guess what? I already write fully cross-platform with C++, no framework required. Ya, there's less of the pretty, but I created my own personal GUI that works just fine, thank you. And far less overhead all round. Gotta love the continual reinventing of the wheel. Look! Shiny!
  9. Glad you got this! I was on the road and for some reason it wouldn't copy and paste from my mobile. Excellent synopsis of what G has been doing at hoarding both content and data. DrP started his history too late imo - Universal Search in 2007 was the beginning, and even that had been trialled for a couple of years prior. And the first 'let's see how far we can go before they scream' was G's image search. Otherwise a very, very well done timeline and explanation. Hysterical that at no time was there mention of webdevs diversifying traffic sources. Of course, when one's business model is tools about G... What is fascinatingly funny, however, are the comments (other than Will's feed lines) of the Mozzers, to whom, apparently, all this is pretty much truly shocking 'news'. Talk about an oblivious demographic. As Kim mentioned, Cre8 members have been discussing each of the 'developments' as they occurred. I've never thought much of most SEO 'tools', and even before I read this morning about Rand's leaving I had been wondering how long most of them, including Moz's, will keep their pseudo-relevance. And Rand's new, more SM focussed, audience chasing tool in development only emphasises that. Gotta love this industry! Never a shortage of popcorn moments!
  10. I posted this over at WMW but it may be a concern of interest for readers here as well. Of course, there are a good many sites that do little to no PII collection and even less usage, and those that anonymise sufficiently to get a 'get out of GDPR' card, but for others the fuss I've been reading over tech requirements is only half the 'problem'. You may satisfy the EU bureaucrats but... what about the visitor? Are you scaring them off or selling the value of your behaviour?
  11. iamlost says to not call it AI

    The 'journalism' bots have been live since 2009 at least, beginning with easy factual data poured into frameworks such as sports scores. What is hysterical is that media is once again chasing its tail; back in both the 80s and 90s newspaper companies did fabulous R&D on news/stories over the Internet (80s) and web (90s), then dropped the ball, much like Xerox PARC in the 70s.

    There has been computer generated text content live on sites for almost two decades - some webdevs have been doing stuff since then that is only becoming 'public' in the past few years. The Google Books project and Project Gutenberg made huge OCR advances that increased efficiencies several fold. The reason such private efforts are, ummm, private is because if, for example, EGOL can create 100 high quality content pages but a computer methodology can create 500, 5,000, or 50,000 equivalent in the same time period, there is a substantial competitive advantage. Plus, if they could identify such sites, the SEs might well apply 'unnatural' penalties.

    What is new, however, is the off-the-shelf ability to seamlessly 'photoshop' heads et al onto bodies in motion or into backgrounds in videos. Add in audio created from samples and 'but it isn't me' takes on a whole creepily serious aspect. What happens when audio visual isn't 'evidence'? Can no longer be accepted as showing reality? The digital revolution is going to increasingly impact society in ways that are difficult to predict.
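Those early bots mostly poured structured data into prose templates. A toy sketch of the idea, with team names, scores, and template wording all invented for illustration:

```python
# Toy data-to-text generation: structured sports results poured into
# prose templates - roughly the technique behind the early news bots.
# Teams, scores, and template wording are invented for illustration.

import random

TEMPLATES = [
    "{winner} beat {loser} {ws}-{ls} on {day}.",
    "On {day}, {winner} edged out {loser}, {ws} to {ls}.",
]

def recap(game):
    """game: dict with home/away names and scores -> one-sentence recap."""
    if game["home_score"] >= game["away_score"]:
        winner, ws = game["home"], game["home_score"]
        loser, ls = game["away"], game["away_score"]
    else:
        winner, ws = game["away"], game["away_score"]
        loser, ls = game["home"], game["home_score"]
    return random.choice(TEMPLATES).format(
        winner=winner, loser=loser, ws=ws, ls=ls, day=game["day"])

game = {"home": "Rovers", "away": "United",
        "home_score": 3, "away_score": 1, "day": "Saturday"}
print(recap(game))
```

Scale that over a feed of thousands of box scores and you get the 500-vs-100-pages competitive advantage described above, with no 'writer' in the loop.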
  12. What is a Micro Moment?

    OMG! Kim, Kim, Kim... That whole thing is simply a Google marketing exercise, taking ye olde well known marketing points and packaging them to sell the necessity of (1) Google AdWords, (2) Google Search, and (3) Google AdWords + Google Search.
  13. Guard Your Crypto

    There are two sides to this:
    Note: the current estimate is that over 3 million sites are mining via visitors' computers/phones. This is especially a problem with phones, which overheat at continued 100% CPU usage.

    1. If the site being visited is upfront and says that it is mining using visitors' computers, it then becomes a choice similar to ad blocking: it's just another revenue stream that may be irritating but is also similar to making a donation. Best practice would be for the site to also offer an opt-in/opt-out choice along with notification, although I'm not aware of any doing such.

    2. If the site is simply mining without notification, I consider it to have crossed the line and would seriously consider leaving and never going back, as well as giving negative reviews.

    As of now I believe Opera is the only browser (version 50) with built-in anti-cryptocurrency mining (in its ad blocker). For many of the rest of us the NoCoin [github] browser extension (for Chrome, for FireFox, for Opera) is an open source consideration. Not available for Edge, IE, Safari.

    Another option is redirecting known miners (requires manual updating) to null via the hosts file:
    * Windows: C:\Windows\System32\drivers\etc -> hosts
    * Linux: sudo nano /etc/hosts
    * Mac: sudo nano /private/etc/hosts -> ENTER -> TYPE admin pwd -> ENTER
    * Android: step 1: /system/etc -> LONG-PRESS hosts -> TAP Menu -> SELECT Properties -> IN Permissions TAP change -> TAP check box for Group under the Write column -> TAP ok -> TAP Cancel -> REBOOT. Step 2: /system/etc -> LONG-PRESS hosts -> TAP More -> TAP Open -> TAP Text. Note: there are apps available to do this...caveat emptor.
    Then add to the end of the file: 0.0.0.0 coin-hive.com (plus any other miners you encounter...)

    And of course there is NoScript, as so far all miners I've seen work via JavaScript, the kiddies' choice.
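The hosts-file step can also be scripted. A minimal sketch, assuming an illustrative path and domain list; editing the real hosts file needs the platform path above and root/admin rights:

```python
# Append known coin-miner domains to a hosts file, pointing them at
# 0.0.0.0 so lookups go nowhere. Skips domains already present.
# HOSTS and the MINERS list are illustrative placeholders.

HOSTS = "hosts"  # e.g. /etc/hosts on Linux; needs root/admin to write
MINERS = ["coin-hive.com"]  # plus any other miners you encounter

def block_miners(path, domains):
    """Append '0.0.0.0 <domain>' for each domain not already in the file."""
    with open(path, "a+") as fh:
        fh.seek(0)
        present = fh.read()
        added = []
        for d in domains:
            if d not in present:
                fh.write(f"0.0.0.0 {d}\n")
                added.append(d)
    return added

print(block_miners(HOSTS, MINERS))
```

Re-running it is safe: already-listed domains are skipped, so the manual-update chore becomes one command per new miner.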
  14. Yes, I have a rather limited white list of allowed crawlers, and even those are confirmed via RDNS and kicked if they transgress allowed conditions. With the uptake of 'the cloud', simply blacklisting data centres has become increasingly counterproductive, so now I primarily block variously on user-agent, which gets the idjit 50%, and on behaviour, which gets most of the rest - eventually. The IAB Tech Lab open sourced a crawler just for ads.txt, so there will be a zillion permutations. Fortunately, ads.txt is a text file at root and there is no requirement/need for the crawler to do anything other than request ads.txt, which means that further/other requests can be denied.
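The RDNS confirmation mentioned above is usually done as forward-confirmed reverse DNS. A sketch; the allowed-suffix list is an example (Googlebot's published domains), and a production check would add caching and stricter error handling:

```python
# Forward-confirmed reverse DNS: resolve the visiting IP to a hostname,
# check the hostname belongs to the claimed crawler's domain, then
# resolve that hostname forward and require it to map back to the IP.

import socket

def verify_crawler(ip, allowed_suffixes=(".googlebot.com", ".google.com")):
    """True only if ip reverse-resolves into an allowed domain AND that
    hostname forward-resolves back to the same ip."""
    try:
        host = socket.gethostbyaddr(ip)[0]              # reverse (PTR) lookup
        if not host.endswith(allowed_suffixes):         # claimed-domain check
            return False
        forward_ips = socket.gethostbyname_ex(host)[2]  # forward (A) lookup
        return ip in forward_ips                        # forward-confirm
    except OSError:  # no PTR record, or forward lookup failed
        return False
```

Spoofing a Googlebot user-agent passes neither leg: an attacker controls neither the PTR record for their IP nor Google's forward DNS, which is why this beats UA filtering.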
  15. ads.txt is to assist G et al in properly attributing ads; yes, there is a whole lot of ad misrepresentation and misattribution sleight of digital hand going on, and as is often the case G et al are unable to actually tell the good from the bad from the ugly. Note: millions, yes millions, of fake ads (at ~10 per second!) have been and still are sold on exchanges, e.g. AdEx, AppNexus, BrightRoll, PubMatic, via spoofed publisher sites, etc. Because so many ads are sold programmatically these days (80% of third party display ads), rather than turn off such a profitable tap they went for an algo friendly solution: ads.txt. That it puts the solution cost/effort onto publishers is just icing on the profit cake.

    If you do NOT use third party advertising (that requires it) then there is no point in including ads.txt. Just as if you do NOT use Apache there is no point in including apache2.conf/httpd.conf/.htaccess.

    I still have AdSense (on less than 10% of pages) so I added ads.txt. Should I ever drop AS altogether and go strictly direct ad/af I'll remove it.
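For reference, an ads.txt is just comma-separated lines in a plain text file at the site root, one authorized seller per line. A typical AdSense entry looks like this; the publisher account ID is a placeholder, while the final field is the standard certification authority ID for Google per the IAB spec:

```text
# served from https://example.com/ads.txt
# fields: ad system domain, publisher account ID, DIRECT|RESELLER, certification authority ID
google.com, pub-0000000000000000, DIRECT, f08c47fec0942fa0
```

DIRECT means the publisher holds the account with the ad system directly; RESELLER marks an intermediary authorized to sell that inventory.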