Archive for February, 2008
Being mainly a PHP backend programmer, I hate to see the design cripple the code. Therefore, I always make sure that stylish and standard controls end up sending the same data to the server.
In this post, I cover buttons, check boxes and radio buttons, and provide a simple PHP helper that automatically generates the code. You might also want to check out the demo page to get an idea of what it looks like.
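The trick boils down to keeping the real form control in the markup — so the browser submits the usual data — and styling around it. Here is a minimal sketch of such a helper (the function name, class names and markup are my own illustration, not the actual helper from the post):

```php
<?php
// Sketch of a helper: the real checkbox stays in the form, so the POST
// data is identical to a plain control; a <label> is layered on top of
// it and styled with CSS to get the fancy look.
function styled_checkbox($name, $checked = false)
{
    $checked_attr = $checked ? ' checked="checked"' : '';
    return '<span class="styled-checkbox">'
         . '<input type="checkbox" name="' . htmlspecialchars($name) . '"'
         . ' id="' . htmlspecialchars($name) . '"' . $checked_attr . ' />'
         . '<label for="' . htmlspecialchars($name) . '"></label>'
         . '</span>';
}

echo styled_checkbox('newsletter', true);
```

The CSS then hides or repositions the native input and paints the label; since the input itself is untouched, the server sees exactly the same data either way.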
CSS, HTML, Programming
Friday 29 February 2008
Recently, I faced the awkward Adobe Flash slowness on Mac and old PCs during the development of the web site of a client of mine.
After doing a little research and testing, I found out that some Macs are especially bad at alpha transparency and scaling, among other things.
This post aims to centralize as much information as possible on the subject, because very little is available on the web and what exists is unfortunately scattered. I’ll try to nail down the facts and outline solutions.
Please add a comment or email me if you feel I’ve forgotten something or if you made a breakthrough discovery.
Flash, Web development
Thursday 28 February 2008
Am I crazy (my mom says I’m special), or does this resemble a business-like Facebook? Pretty neat, nevertheless! Check out the LinkedIn Beta here.
Marketing, Social networks
Wednesday 27 February 2008
The Comcast affair (and the recent China and Pakistan filtering) shines the spotlight on a Pandora’s box that must not be opened: net neutrality.
Network neutrality basically states that your internet service provider shouldn’t control what you do and don’t have access to on the internet.
Wednesday 27 February 2008
I found an article about static web design workflow by Josh at Tutorial a day. While his process is great for static web sites, it isn’t adapted to dynamic web design, with a PHP/ASP/DotNET/JSP/ColdFusion (yuk) backend, that is.
I’ll try to briefly cover every step we — at Quantik Solutions — take to ensure the delivery of a web site that meets the client’s and the users’ needs. This being an overview, I will try to develop each step in a separate post later on.
Step 1. Market, history and present situation analysis
This basically consists in looking at the client’s past and current web sites, as well as its competitors’. The objective is to get a list of do’s, keepers and don’ts, either from the functionalities of the analyzed sites, or from functionalities that are missing from them. We analyze the available web statistics to get quantitative information about what information visitors seek the most and the least.
This step also consists in identifying the characteristics of the major personas:
- What are their demographics? (age, sex, education, revenue, etc.)
- What is their internet knowledge? (often based on their demographics)
- What information do they look for on the site in the first 30 seconds, the first 2 minutes, the first 10 minutes?
Step 2. Needs and requirement gathering
Usually — but strangely enough not always — the redesign or overhaul of a site is triggered by new requirements or new needs. The objective of this step is to get a list of all of them, but also to discover latent needs and requirements, or to convert requirements into strategies. We also try to get an idea of the client’s budget.
For example, a client could require that its web sales improve by 10%.
- A latent requirement could be that the checkout process be made simpler, as it has 10 steps right now;
- His 10% improvement requirement could be converted into an immediate or delayed up-selling strategy.
It might look like rocket science, but all the answers usually come quickly and easily after the market analysis at the previous step.
Usability, Web development
Wednesday 27 February 2008
Most web applications in this World 2.0 target international users distributed among many different timezones. It is pretty simple to ask users what their timezone is when they register on your site. This article aims to solve the case of unregistered users, whom you can’t ask in a usable way. The solution isn’t perfect, but it’s better than nothing.
In theory and technologically, we could determine what timezone the user is in by:
- Using a geo location database to guess their longitude and latitude, and then finding the probable timezone from it;
Geo location, while pretty simple nowadays with free databases floating around, leaves us with a problem: the free databases are often not good enough to pinpoint a precise longitude and latitude, which gives us garbage in, garbage out when translating it to a timezone. Moreover, the precise, non-free databases are pretty expensive compared to the added value of timezone detection, and are most likely not worth buying just for this purpose.
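As a rough illustration of the “probable timezone from longitude” step, a naive approximation simply divides the longitude by 15 degrees per hour. This is a sketch of my own, and it shows exactly why the approach is fragile: real timezone borders follow politics, not meridians.

```php
<?php
// Naive sketch: approximate a UTC offset from a guessed longitude.
// The Earth covers 360 degrees in 24 hours, i.e. 15 degrees per hour.
function longitude_to_utc_offset($longitude)
{
    return (int) round($longitude / 15);
}

// Montreal sits at roughly -73.6 degrees of longitude:
echo longitude_to_utc_offset(-73.6); // -5
```

A bad latitude/longitude guess from a free database feeds this formula garbage, and a good guess can still land on the wrong side of a political boundary — hence the “better than nothing” caveat.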
Monday 25 February 2008
What better, on a Saturday morning, than to write (or read, will you say) about a PHP framework with a nice cup of java – the warm caffeinated beverage, that is.
Saturday morning thought: the Java developers I know are like Jehovah’s Witnesses: they divert any programming conversation to “Java is the future, wake up from your slumber”. Java’s been the future for 10 years, yet it has failed to make it into the present. But let’s take another sip of coffee and digress no more…
While I personally (this is a personal blog after all) like the Zend Framework for its lightness, speed and cleanliness, I rarely consider it for rapid development: I dislike its MVC, and tend to use Zend more like I use PEAR libraries. The documentation could be better too.
My love goes to CodeIgniter, a PHP framework developed by the guys at EllisLab. Setting up a CodeIgniter environment is as easy as 1-2-3: unzip (or untar if you’re a real geek), configure the database, and take a sip of coffee (did I tell you I like coffee?).
Contrary to Zend, the documentation is a wonder: tutorial-oriented while not cluttered with statements of the obvious. The only thing I could use sometimes is an API-style documentation, which is nonexistent and leaves me digging through their code.
The essential libraries are there, ready to be loaded and used: configuration, database, benchmarking, crypto, input, input validation, output, language, unit testing, and much more. Plus there are a few very good contribs out there on their wiki and elsewhere on the web.
Enough propaganda, let’s be honest, CodeIgniter is not perfect:
- Configuration and language files are PHP-declared arrays, which makes them kind of hard to maintain sometimes. Plus I’m worried (although I haven’t benchmarked it yet) about the cost of multiple file inclusions.
- The model object provides a lot of flexibility (actually, you do it the way you want), but flexibility is a double-edged sword. A real, functional yet flexible out-of-the-box active record class would make rapid development rapider.
- The cookie-based session class sucks: all your data goes in cookies + cookies are size-limited = problems. But there are tried-and-true replacements in the contribs.
- There is no “real world” caching class out of the box. Sure, there’s a caching class that caches your output per URL, and it works well if a URL always looks the same no matter who visits it. But in a Web 2.0 world full of logins and “Hello Tommy”, that’s not the case. Still, you can extend the output class by adding caching hooks.
- The framework is still fully PHP4 compatible, which tends to make its object-oriented code ugly. Let’s hope they drop that sooner rather than later.
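To illustrate the first complaint above: a CodeIgniter configuration file is nothing more than a PHP script filling in an array, loaded through file inclusion. The keys below are illustrative, not an exhaustive or exact list:

```php
<?php
// system/application/config/config.php -- configuration is a plain
// PHP array declaration; each config file you load is one more include.
$config['base_url']   = 'http://www.example.com/';
$config['index_page'] = 'index.php';
$config['language']   = 'english';
$config['charset']    = 'UTF-8';
```

Simple, but once you have a dozen of these (plus language files in the same format), hand-maintaining them and paying for the includes on every request starts to itch.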
Yet, CodeIgniter remains my first choice. There are more pros than cons, and we’ve built a lot of libraries over time that ease the downsides.
Saturday 23 February 2008
In case you didn’t stumble on this pretty neat list of CSS techniques, tips and tricks from Smashing Magazine, well here it is.
CSS, HTML, Web development
Friday 22 February 2008
Simplistic diagram showing content push/pull from user, and Digg + Google Trends integration
I came across an old video podcast of Sage Lewis yesterday talking about Google Trends, and an idea came to my mind (again).
If you don’t know about Google Trends, this nifty evil Google tool tries to detect trends in user searches. Some sort of Digg, but on another level:
- Digg shows content people found (either by pull or push) that interested them enough and that they wanted to share.
- Google Trends shows topics people were searching for (pull).
My experimental idea is to build a fully automatic, opportunistic platform that will exploit these trends in order to generate traffic.
The three main components are:
- Be able to “sense” these trends in time
- Be able to give the users the content they are searching for
- Be able to have the search engines reference you for those topics rapidly
Sensing is easy in itself: Google Trends provides the information. However, more data must be brought in, since Trends is only updated a few times daily. If we take American Idol for example, user searches only make it into Google Trends past a certain point, after a certain time. So we are definitely not going to catch a trend at its beginning. The goal is to catch it before its apogee.
Content acquisition could be achieved by using Yahoo’s or Google’s search API, or by having a human editor (but that kind of defeats the fully automatic opportunistic platform). Or it could be achieved by preselecting web sites for specific topics (such as a lyrics site, a sports news site, etc.), and matching them against trends.
Referencing is perhaps the hardest part. Since we aim for most of the content to be automatically acquired, and the experiment is new, it’ll be hard to get Google to rank us rapidly for those keywords.
One of the paths I’m looking at is automatic content submission to social engines. There are very good chances that the trended content we’ve acquired will be “Dugg” sooner or later. So why not tell Digg about it right now? The subject is in the air for the majority of the population, so it might be relevant to the Digg population (nb.: Digg != the world; my sister never goes to digg.com, yet she’s part of the world).
The second path I’m looking at is search engine advertising. People will type “that song lyrics” into Google. We know that they will, it’s a trend. “So why not automatically buy those keywords through the Google AdWords API?” say I with an evil grin on my face. This could work pretty well to get on the trend (image below, dart 1), then more traffic could be brought through sharing features such as “send to a friend”, Digg, del.icio.us, etc. (dart 2) before search engines finally catch up (dart 3).
So these are my thoughts about how to try to exploit Google Trends information. Everything is still to be done.
Friday 22 February 2008
While reviewing technology Diggs on my cellphone yesterday, I stumbled upon this neat article by Jacob Gube of Six Revisions about how to manage feature creep.
If you’re in web development, you’ve been hit by feature creep at least once. Quoting Six Revisions, the concept refers to “unforeseen requests for additions and changes that are outside the project scope”.
While many think it’s a “beginner’s mistake”, I know many pros who get caught every now and then, especially with certain clients who are more sympathetic and creative than others.
Gube outlines 8 tips for managing it:
- Accept that feature creep will happen. Because needs are dynamic, and you can’t stop ideas – good or bad – from popping into everyone’s head.
- Commit enough time to requirements gathering. Because the “beginner’s” cause of feature creep is planning on incomplete requirements, thus omitting important ones and the features that come with them.
- Giving a hand might cost you your arm. Because good faith and good will are noble, but giving away features creates a habit with the customer, and on-the-fly feature evaluation often bypasses later stages of development such as quality assessment.
- Be the devil’s advocate when changes are requested. The stakeholders will come with the pros of adding the feature, but rarely the cons. Make sure they are aware of them.
- Be task-oriented, not vision-oriented. A vision-oriented deliverable like “building a web site that will be findable by search engines” is an open door to feature creep. Task-oriented deliverables like “building a web site with a Google Sitemap, intelligent meta tags, and clean & valid XHTML/CSS” make everything clearer to all stakeholders from production day 1.
- Shed the “Customer is Always Right” mentality. Because he’s not. If he always was, he wouldn’t hire specialists like you and me.
- Research before committing. Know what a specific feature addition involves in terms of time. I can’t think of a better example than Gube’s: before agreeing to migrate a web site from your servers to the client’s, you should first do the research that reveals his are Windows IIS 6.0 based while yours are Linux based, and that the migration will most likely require modifications to the code.
- Realize that feature creep is a two-way street. You forget, they request, you accept. Enough said.
Should you want to read the whole article on Six Revisions, you’ll find it here.
Update (Feb 22nd, 11:47 AM): I mistakenly attributed the Six Revisions article to Martin Kingsley. Credits corrected; thanks to the real author, Jacob Gube, for pointing it out to me. Martin Kingsley is the author of the little orange cards picture.
Friday 22 February 2008