Tracking source and subsource codes in links?


I placed a link with source and subsource codes (our first one on the site) on our home page in a prominent position. When I go to check how many clicks occurred over the holiday weekend, the link does not show up in AWS.

I am aware it may not have been clicked at all, but I highly doubt that given its position and message. Do sources/subsources not show up in the tracking? Did I use them incorrectly? How should I use these codes going forward?

I was led to believe I could use these codes to get a better idea of who is clicking on what and where, yet this particular link does not show up in the list of pages accessed on the site. I even tested it a couple of times after it went live, so it has definitely been clicked...

Thank you.


Comments

  • Hey jdp,

    I was specifically thinking of using report writer to track the source and subsource -- since that is what I assumed the original person you talked to was discussing.

    What are you using to track the clicks? What is AWS?

  • Adrian Cotter:

    What are you using to track the clicks? What is AWS?

    AWStats is the integrated website tracking engine Convio provides:

    Report Center > Site Management Reports > Web Usage Reports

    Use the username and password it gives you at the top of that page, where it says "Important information to access Web Usage Reports".

    AWStats isn't the best analytics package out there, but it's not a bad one either, and it's free and open source. We don't get admin rights to the Convio install, though. I REALLY wish we could get FTP access to our log files; only having them accessible via HTTP one at a time is crazy. I'll see if I can come up with a way to automate that... unless someone already has one -- I'd love to hear about it if so!

  • Sorry, I didn't give an answer to the original post. AWStats isn't great at tracking anything in the query string. If you're looking for clicks, I suggest you use Google Analytics or the like, and/or download the raw log files and run a log analyzer package locally to get at the data that way. Convio will likely only register data that report writer can get at when a person actually completes some sort of Interaction that allows source and subsource to be recorded. It could be that the database records every new session as an interaction (I doubt it), but that would only apply to logged-in users anyway, not organic web traffic.
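    (For reference, the source and subsource codes usually just ride along in the link's query string -- something like http://www.example.org/site/PageServer?pagename=homepage&s_src=homepage&s_subsrc=top_banner. I'm going from memory on the exact parameter names, so double-check them against what Convio set up for your site.)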

    You need to use a web analytics engine to track website clicks; nothing that Convio provides can really do that effectively. They have Google Analytics productized if you want to go that route. There are other services that are page-tag based, and plenty that are log-file based, and there are pros and cons to both types.

    CARE has a locally installed version of WebTrends Marketing Lab 8, and we have a WebTrends SDC server as well (the same as the page-tag-based WebTrends OnDemand service, except we run it on our own server). We also pull stats from Google Analytics, Google AdWords, and Google Conversion... something or other, on top of the analytics we get from the Convio system. As I mentioned above, though, the one piece we don't have in place yet is an *automated* download of raw logs for local analysis. If they were available via FTP, it wouldn't be a problem.

  • FYI -- I just finished a Perl script that does what I said above:

    1. Logs into the Convio Web log URL -- you have to configure the login info in the script.

    2. Downloads all the already-compressed files from the RAW list. It ignores the ones Convio hasn't gzipped yet, and it also ignores any you might have already downloaded. That check is very basic -- does the file exist locally or not -- with no content comparison, so if you find a file is corrupt or incomplete, just delete it and re-run the script.

    You have to configure your local log directory, hostname and authentication credentials yourself in the file. Also, the machine it runs on has to have Perl installed, along with the LWP::UserAgent module.
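    Here's a rough outline of the approach, not the exact script -- the hostname, listing URL, auth realm and paths below are placeholders, so swap in whatever the Web Usage Reports page gives you:

    #!/usr/bin/perl
    # Rough outline only: fetch the listing page over HTTP, find the gzipped
    # raw logs, and download any we don't already have locally.
    use strict;
    use warnings;
    use LWP::UserAgent;

    my $host     = 'www.example.convio.net';      # placeholder hostname
    my $list_url = "http://$host/logs/raw/";      # placeholder listing URL
    my $realm    = 'Web Usage Reports';           # placeholder auth realm
    my $user     = 'CHANGEME';
    my $pass     = 'CHANGEME';
    my $log_dir  = '/path/to/local/logs';         # where to save the files

    my $ua = LWP::UserAgent->new;
    $ua->credentials("$host:80", $realm, $user, $pass);

    # Grab the listing and pull out links to files Convio has already gzipped.
    my $index = $ua->get($list_url);
    die 'Listing fetch failed: ' . $index->status_line . "\n" unless $index->is_success;
    my @files = ($index->decoded_content =~ /href="([^"]+\.gz)"/gi);

    for my $file (@files) {
        my ($name) = $file =~ m{([^/]+)$};
        my $local  = "$log_dir/$name";
        next if -e $local;    # very basic check: skip anything already downloaded

        my $resp = $ua->get($list_url . $file, ':content_file' => $local);
        print $resp->is_success
            ? "Downloaded $name\n"
            : "Failed $name: " . $resp->status_line . "\n";
    }

    The ':content_file' option tells LWP::UserAgent to stream each response straight to disk, which keeps memory use down on big log files.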

    Anyway, if anyone is interested, let me know -- I'll be glad to share.
