Kevin Steffer Outloud – web, business and opinions

29 Jan 2010

Review of tracking data in the Google Analytics cookie

Background

At the first meeting of the IT Forum SIG (Special Interest Group) on Online Marketing, Marlene said that she could not understand why her traffic had increased in the source / medium segment labeled (direct) / (none), which represents unclassified traffic. What is this traffic, and why is it increasing in Marlene's case?

I have analyzed the way Google Analytics (GA) tracks visits and the way it manages cookies, and I believe I have found an explanation for Marlene's problem, which I will walk through here.

Google Analytics Cookie explanation

http://code.google.com/intl/en/apis/analytics/docs/concepts/gaConceptsCookies.html

GA creates a number of cookies in your browser, each with its own expiration time.

  • The unique visitor information (__utma) expires after 2 years.
  • The traffic source / campaign tracking information (__utmz) expires after 6 months.
  • The information about the current visit/session (__utmb) expires 30 minutes after the last pageview on the domain.
  • The "Custom Tracking" information (__utmv) expires after 2 years.
  • The Website Optimizer information (__utmx) expires after 2 years.

A little more background

I now have one additional piece of information about Marlene's problem: she believes they have changed the cookie expiration time and set it down to one day. My guess is that Marlene's configuration has changed the expiration time of the tracking cookie (__utmz) to one day, and I can explain the consequence with a test I have done.

My analysis

This is how my cookie looked after I had deleted my cookies and gone directly to http://blogs.co3.dk/kevinsteffer/ by typing it into the address bar.
Note:

utmcsr=(direct)|utmcmd=(none)
Expires: 30 July 2010 11:11:19

[Screenshot: the cookie contents in the browser]

Then I go to http://twitter.com/kevinsteffer/status/8163389926, click on my link http://bit.ly/8VZEmq, and take a look at my cookie again:

[Screenshot: the cookie after clicking the Twitter link]

Now it has just been updated:

utmcsr = twitter | utmccn = Spreading | utmcmd = social
Expires: 30 July 2010 11:17:16

Now I close my browser and open it again and enter the address: http://blogs.co3.dk/kevinsteffer/

Now I am interested in what data Google Analytics logs, and not so much in my cookie.

Just for information, you can see here that my cookie has not been updated:

[Screenshot: the cookie, unchanged]

Note:

utmcsr = twitter | utmccn = Spreading | utmcmd = social
Expires: 30 July 2010 11:17:16

Here's what Google logs about me:

utmcc=__utma=231311308.1243725033.1264713079.1264713079.1264713948.2;+__utmz=231311308.1264713436.1.2.utmcsr=twitter|utmccn=spreading|utmcmd=social;

Note:

utmcsr = twitter | utmccn = Spreading | utmcmd = social

So this is actually the wrong information - I typed the address into my browser; I have not clicked on any link with tracking code.

However, keep an eye on another part of the logging information:

utmcc=__utma=231311308.1243725033.1264713079.1264713079.1264713948.2;
+__utmz=231311308.1264713436.1.2

The first piece of information ends with .2 and the second has .1.2. Now I will see if I can get some of this information up to .3, either by clicking a link on the site to create one more pageview, or by closing the browser and entering the address again - in other words, by generating a new visit.

Clicking on links changes nothing in the logged information:

__utma=231311308.1243725033.1264713079.1264713079.1264713948.2;+__utmz=231311308.1264713436.1.2.utmcsr=twitter|utmccn=spreading|utmcmd=social;

Now I close the browser and enter the address: http://blogs.co3.dk/kevinsteffer/

Now I got it up to .3

__utma=231311308.1243725033.1264713079.1264713948.1264714521.3;+__utmz=231311308.1264713436.1.2.utmcsr=twitter|utmccn=spreading|utmcmd=social;

So it does count something in one way or another, yet there is one piece of information that does not change. The point here is that the counter at the end of __utma counts the visits, but for the same unique user, while the __utmz tracking information stays the same.
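
To make these numbers easier to read, here is how I read the fields of the __utma value, based on the cookie documentation linked above. It is only an illustrative C# sketch - the labels are my own, not any official API:

using System;

class UtmaDemo
{
    static void Main()
    {
        // My reading of the classic __utma value logged above,
        // based on Google's cookie documentation linked at the top of this post.
        string utma = "231311308.1243725033.1264713079.1264713079.1264713948.2";
        string[] parts = utma.Split( '.' );

        Console.WriteLine( "Domain hash:    " + parts[ 0 ] );
        Console.WriteLine( "Visitor id:     " + parts[ 1 ] ); // random unique visitor number
        Console.WriteLine( "First visit:    " + parts[ 2 ] ); // Unix timestamp
        Console.WriteLine( "Previous visit: " + parts[ 3 ] ); // Unix timestamp
        Console.WriteLine( "Current visit:  " + parts[ 4 ] ); // Unix timestamp
        Console.WriteLine( "Visit count:    " + parts[ 5 ] ); // the counter that goes from .2 to .3
    }
}

Note how the last field is the counter I am chasing, while the visitor id in the second field never changes.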

Conclusion

Now I run a test where I have deleted my cookies and then visited my blog with tracking code in the URL:

http://blogs.co3.dk/kevinsteffer/#utm_source=cookietest&utm_medium=direkte&utm_campaign=sig

Then I closed my browser, opened it again and went to my blog http://blogs.co3.dk/kevinsteffer/ without any tracking code. I did this 5 times and have now generated a total of 6 visits.

[Screenshot: the 6 visits in Google Analytics, all attributed to cookietest / direkte]

As can be seen here in GA, with a single click on a link carrying tracking information and 5 visits made by typing the address directly into the address bar, I have generated 6 visits that all carry the tracking information from my very first visit, because my cookie was set at that first visit (or a new one was set because the old cookie had expired).

What does it tell us then?

It shows that I can generate 6 visits with the same cookie information even though I actually typed the address into the address bar for 5 of them.

I do not really know what to think about it; my first thought is that this is confusing. Until now I have read figures like these as "1 user has clicked through to my site via the link tagged cookietest / direkte 6 times", but that is not the case: the user has done it at least 1 time and has, within the last 6 months, not clicked on any other link that would give him a new tracking cookie.

We could try to calculate how many users are behind these 6 visits carrying the cookie information "cookietest / direkte", but I cannot infer that from my test data, so I will test and write about it some other time.

In Marlene's case, where they have set the cookie lifetime down to one day, GA will end up with a lot of old users who have lost their tracking cookie; when they return via bookmarks or by typing the address into the address bar, they no longer count under their initial tracking information, but simply as (direct) / (none) traffic.

Am I totally wrong on this? What do you think - or should I call myself Cookie Nerd in the future?

Filed under: Outloud
7 Jan 2010

Install Acquia Drupal via Microsoft Web Platform Installer

I selected Acquia Drupal in the Web Platform Installer, just leaned back - and whoops...

[Screenshot: Acquia Drupal selected in the Web Platform Installer]

Then I tried to launch the site and... BLAHHHH: 500 Internal Server Error.

I'm on Windows 7 Professional with IIS 7.5 and Microsoft Web Platform Installer 2.0.

Error: PHP FastCGI can’t find php executable

Now what?

After looking at the FastCGI settings in IIS I found out that php-cgi.exe wasn't installed in the folder C:\Program Files\PHP\ where the settings pointed it to be!

Solution

I downloaded the installer version of PHP 5.2.12 (the version available via the Platform Installer); during installation I could choose Change on my existing PHP installation and simply add everything (all extensions, the manual etc.). Now you've got php-cgi.exe at C:\Program Files\PHP\php-cgi.exe :)

I must admit that installing ALL extensions left me with every extension enabled in php.ini, and I got errors when trying to run php.exe. I disabled the extensions it reported missing DLLs for (e.g. OCI.dll, aspell-1.5.dll, libcs.dll and more) - just comment out (prefix with a semicolon) the offending extension lines at the bottom of the php.ini file.

And now you're ready to finish your install by configuring your site: http://localhost/acquia-drupal/install.php

Happy Drupalling!

Filed under: Outloud
27 Dec 2009

LINQ, Dataset and Recordset performance results

For a while now I've been wondering why I often run into bad performance when using LINQ to access my SQL Server database, so I decided to hunt down the beast and see if I could come up with something.

In this first test I haven't found the beast, but I still found some interesting results that I want to share. The results confirm my habit of not using a Dataset to hold data, simply because in most cases loading a Dataset with the SqlDataAdapter Fill method is too time-consuming an approach - and here you can see why.

My test simply reads out 1,000, 10,000 and 100,000 rows from a simple table and measures how long it takes to get the data out in three different ways: with LINQ, as a Dataset and as a Recordset (a SqlDataReader) from the SQL Server.

Conclusion

The Recordset is twice as fast for small database executions

When working with small amounts of data LINQ isn't as fast as the Recordset, but in my case the difference is not noticeable in a single execution. When you have to run a lot of small jobs against the database, however, you should keep in mind that the Recordset will do the job twice as fast.

The Dataset should only be used where the same data is reused many times and in a variety of ways

When you only have to count the number of rows fetched, pray that you didn't use the Dataset method. What drags the method down is that it has to load all the data with all its columns into the Dataset, just so you can call the Count property on the Rows collection of a table in the Dataset object. It simply takes way too long.

With 10,000 rows both LINQ and the Recordset are twice as fast as the Dataset

When I increase the number of rows in the table by a factor of 10 to 10,000, the test times rise, which is fair enough because the database sends 10 times more data, but the factors still surprise me. The LINQ method's time rises by a factor of 4.5, the Dataset method's by a factor of 10, and the Recordset method's climbs by a factor of 6.5.

With 100,000 rows LINQ begins to catch up with the Recordset

When I further increase the number of rows by a factor of 10 to 100,000, I would expect roughly the same factors, but now they rise even further. The LINQ method's time rises by a factor of 8.7, the Dataset's by a factor of 9, and the Recordset's by a factor of 9.5.

Download the Visual Studio 2008 Project

Test system

Hardware: HP EliteBook laptop, 3 GB RAM, 2.53 GHz Intel Core Duo, 32-bit
Software: Windows 7 32-bit, SQL Server Express 2009, Visual Studio 2008 Professional Developer Web Server

Test scenario

I have separated my test into 4 scenarios, and each scenario is run 10 times with the same method so I can use the average time consumption for my conclusion.
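
As a side note, this is roughly how such a 10-run average could be computed. It is only a sketch of the idea - the actual test project times each run with DateTime.Now inside the Do...Test methods shown below, and RunLinqTestOnce here is a hypothetical stand-in for any one of them:

using System;
using System.Diagnostics;

public static class TestTimer
{
    // Runs the given test a number of times and returns the average elapsed time.
    // Stopwatch gives slightly better resolution than DateTime.Now, which the test project uses.
    public static TimeSpan Average( Action test, int runs )
    {
        Stopwatch watch = new Stopwatch();
        for ( int i = 0; i < runs; i++ )
        {
            watch.Start();
            test();
            watch.Stop();
        }
        return TimeSpan.FromTicks( watch.Elapsed.Ticks / runs );
    }
}

// Usage, where RunLinqTestOnce is a hypothetical stand-in for one of the Do...Test bodies:
// TimeSpan average = TestTimer.Average( RunLinqTestOnce, 10 );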

LINQ measure method

private void DoLinqTest()
{
    DateTime dtLinqStart = DateTime.Now;
    using ( DB.dbDataContext db = new DB.dbDataContext() )
    {
        // Execution code
    }
    DateTime dtLinqEnd = DateTime.Now;
    lblLinqTime.Text = ( dtLinqEnd - dtLinqStart ).ToString();
}

Dataset measure method

private void DoDataSetTest()
{
    DateTime dtDsStart = DateTime.Now;
    using ( SqlConnection dbCon = new SqlConnection( ConfigurationManager.ConnectionStrings[ "dbConnectionString" ].ConnectionString ) )
    {
        using ( SqlDataAdapter sqlAdap = new SqlDataAdapter( "SELECT * FROM item", dbCon ) )
        {
            DataSet ds = new DataSet();
            sqlAdap.Fill( ds );
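            // Note: Fill pulls every row and every column into the DataSet in memory before the scenario code below runs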

            // Executing code
        }
    }
    DateTime dtDsEnd = DateTime.Now;
    lblDsTime.Text = ( dtDsEnd - dtDsStart ).ToString();
}

Recordset measure method

private void DoRecordSetTest()
{
    DateTime dtRsStart = DateTime.Now;
    using ( SqlConnection dbCon = new SqlConnection( ConfigurationManager.ConnectionStrings[ "dbConnectionString" ].ConnectionString ) )
    {
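        // Unlike SqlDataAdapter.Fill, ExecuteReader requires the connection to be opened explicitly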
        dbCon.Open();
        using ( SqlCommand sqlCmd = new SqlCommand( "SELECT * FROM item", dbCon ) )
        {
            using ( SqlDataReader rs = sqlCmd.ExecuteReader() )
            {
                // Executing code
            }
        }
    }

    DateTime dtRsEnd = DateTime.Now;
    lblRsTime.Text = ( dtRsEnd - dtRsStart ).ToString();
}

Scenario 1

A test, shown in the test result sheet as "1000 List", that reads out 1,000 database rows as objects of this type:

public class item
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string Description { get; set; }
    public DateTime Date { get; set; }

    public item()
    {
    }

}

Executing LINQ code:

List<item> objItems = ( from itm in db.items
                        select new item
                        {
                          Id = itm.id,
                          Name = itm.name,
                          Description = itm.description,
                          Date = itm.date.Value
                        } ).ToList();

Executing Dataset code:

List<item> items = new List<item>();
DataRow[] drs = ds.Tables[ 0 ].Select();
for ( int i = 0; i < drs.Length; i++ )
{
    object[] dbFields = drs[ i ].ItemArray;
    item itm = new item();
    itm.Id = ( int )dbFields[ 0 ];
    itm.Name = dbFields[ 1 ].ToString();
    itm.Description = dbFields[ 2 ].ToString();
    itm.Date = ( DateTime )dbFields[ 3 ];
    items.Add( itm );
}

Executing Recordset code:

List<item> items = new List<item>();
while ( rs.Read() )
{
  item itm = new item();
  itm.Id = rs.GetInt32( 0 );
  itm.Name = rs.GetString( 1 );
  itm.Description = rs.GetString( 2 );
  itm.Date = rs.GetDateTime( 3 );
  items.Add( itm );
}

Scenario 2

A test, shown in the test result sheet as "1000 Count", that counts the number of rows fetched from a database table with 1,000 rows.

Executing LINQ code:

int count = ( from itm in db.items
              select new item
              {
                Id = itm.id,
                Name = itm.name,
                Description = itm.description,
                Date = itm.date.Value
              } ).Count();

Executing Dataset code:

int count = ds.Tables[ 0 ].Rows.Count;

Executing Recordset code:

int count = 0;
while ( rs.Read() )
{
    count++;
}
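
As an aside: if the row count is all you need, LINQ to SQL can translate a plain Count() on the table into a SELECT COUNT(*) that runs on the server, so no rows are materialized on the client at all. That is outside the scope of this comparison, because all three methods above deliberately fetch the rows first, but for reference it would look like this (using the same db DataContext as in the LINQ measure method):

// Lets the database do the counting (translated to SELECT COUNT(*))
// instead of fetching and materializing every row on the client.
int count = db.items.Count();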

Scenario 3

A test, shown in the test result sheet as "10000 List", with the same executing code as Scenario 1, but this time with 10,000 database rows to read out.

Scenario 4

A test, shown in the test result sheet as "100000 List", with the same executing code as Scenario 1, but this time with 100,000 database rows to read out.

Test results

Download the Visual Studio 2008 Project

Filed under: .NET, ASP.NET
29 Oct 2009

Add Reference Dialog Improvements (VS 2010 and .NET 4.0 Series) – ScottGu’s Blog

YES! In VS 2010 the Add Reference dialog starts on the Projects tab instead of the .NET tab, which has always taken a considerable amount of time to load - very annoying when you just want to browse the file system for a DLL file to reference.

Great improvement Scott!

Add Reference Dialog Improvements (VS 2010 and .NET 4.0 Series) - ScottGu's Blog

Filed under: .NET, ASP.NET
12 Oct 2009

Best Practice Visual Studio with SVN, VCS and SCM

In my research into what opinions and experience others have with Subversion (SVN) and source control management (SCM), I have collected some good links of various kinds – slide shows, the red-bean Subversion book and plain experiences and ideas.

The focus of my research was web development, especially ASP.NET development with Visual Studio together with third-party CMS software.

The major pain is Visual Studio: when you open your solution and don't touch anything, your "Solution.suo" file still gets modified, and with SVN your project folder then ends up in a modified state and you need to take action - either commit the change or revert the file.

My conclusion for ASP.NET development with Visual Studio is:

Add the following to your ignore list:

  • Solution\Project\bin
  • Solution\Project\obj
  • Solution\Solution.suo (hidden file)

Have your repository layout like this:

  • Project
    • branches
    • tags
    • trunk

Use the trunk for your "main line" development. By "main line" I mean primary development that is always stable and never has checked-in code that doesn't build.

Use your branches for testing, experiments and development of larger features that should not break the trunk but still need to be committed often, for backup and for a history of file changes. Keep your branch in sync with the trunk: remember to regularly merge changes from the trunk into your branch. This prevents you from "drifting" too far away from the trunk and makes it much easier to merge your branch back into the trunk when the time comes.

Use your tags for creating snapshots of your trunk or a branch that go into releases, whether as test builds or as the LIVE beasts that hit the production servers.

If you have questions or suggestions, I'd very much like to hear from you and about your experience with the subject – thanks in advance.

Subversion Best Practices Links

http://www.slideshare.net/mza/subversion-best-practices
http://electricjellyfish.net/garrett/talks/oscon2004/svn-best-practices/
http://svn.collab.net/repos/svn/trunk/doc/user/svn-best-practices.html
http://www.red-bean.com/fitz/presentations/2006-06-28-AC-EU-Subversion-best-practices.pdf
http://devnulled.com/content/2006/10/guide-and-best-practices-for-subversion-branching/
http://nedbatchelder.com/text/quicksvnbranch.html
http://daptivate.com/archive/2008/08/28/subversion-best-practices-for-web-applications.aspx

Filed under: ASP.NET, CMS, Web