PackFlash SEO | Version History
  
Version history for PackFlash Friendly URLs and SEO Module

Version 4.03

  • Updated WebAPI handling on DNN 7, including support for private and public variables.

Version 4.02

  • Moved broken links to their own table, separating them from the SEO table. This makes the process easier to understand, with no broken links interfering.

Version 4.01

  • Enhanced caching makes everything faster. In many cases, over 90% speed improvement.

Version 4.00

  • Added ability to add/edit URLs and assign canonicals through Webservice API

Version 3.04

  • Added ability to modify SEO information "on page" with DotNetNuke 6+

Version 3.03

  • Added URL rewriting functionality across all PackFlash and third-party modules

Version 3.02

  • Implemented Canonical URL management and automatic detection/grouping of duplicate content
  • Added cross-site functionality to handle content sharing with PackFlash Modules

Version 3.01

  • Added ability to manage 404 pages for each portal (except child portals with same domain) rather than universal for DNN installation
  • Streamlined setup process

Version 3.00

  • Integrated into the Constellation package
  • CRAWLER IS NOW OPTIONAL. This version provides the ability to process DNN pages without running the crawler. This provides a way to start working on URLs right away, particularly for large sites where the crawl would take a long time.
  • The crawler can be set to run using the scheduler (legacy mode), on demand, or not at all.
  • Friendly URLs are generated dynamically ("on-the-fly") by browsing to each page. When a URL is requested, the system checks whether it has already been recorded. If the URL is already in the database, it skips generation and delivers the page according to the settings. If the URL is not in the database, the system creates a friendly URL; this happens ONLY the first time a new URL is discovered.
  • Created a new timeframe setting (default = 5 seconds) that establishes how much time may be spent creating a friendly URL when "auto-approve" is on. The first time a URL is visited, the system spends up to 5 seconds creating a friendly URL and redirecting to it. If it fails (usually because of resources), it forwards to the page as DNN normally would and finishes the process in the background. The next visitor and everyone after that will see the new friendly URL.
  • Added a new "delete URL" button for each URL. It removes the URL from the database completely and allows you to re-process it, if applicable, so error URLs can be managed rather than sticking in the system. The previous "Remove URL" button still exists as a way to move URLs to the system status.
  • Added a new "reprocess discard rules on existing URLs" button for cleaning up unnecessary URLs that might cause confusion. A good example is the tracking code appended to URLs for marketing campaigns. This allows bulk removal of "noise" URLs from the system.
  • Improved management of child portal domains.
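
The on-the-fly behavior described above (look the URL up, generate a friendly form only on first discovery, fall back to the normal page when the time budget runs out) can be sketched roughly as follows. This is a hypothetical Python sketch; `resolve`, `url_table`, and the slug logic are illustrative and not the module's actual API.

```python
import time

# Hypothetical in-memory stand-in for the module's URL table.
url_table = {}

def generate_friendly_url(raw_url):
    """Simulate slug generation (the real module does much more)."""
    slug = raw_url.strip("/").lower().replace("default.aspx?tabid=", "page-")
    return "/" + slug.replace("&", "-").replace("=", "-")

def resolve(raw_url, auto_approve=True, budget_seconds=5.0):
    """First request for a URL creates its friendly form; later
    requests hit the stored mapping and skip generation entirely."""
    if raw_url in url_table:                 # already recorded: skip generation
        return ("redirect", url_table[raw_url])
    if not auto_approve:
        return ("serve_original", raw_url)   # record later in the background
    start = time.monotonic()
    friendly = generate_friendly_url(raw_url)
    if time.monotonic() - start <= budget_seconds:
        url_table[raw_url] = friendly        # happens only once per URL
        return ("redirect", friendly)
    # Out of time: serve the page as DNN normally would and finish
    # recording in the background (not shown here).
    return ("serve_original", raw_url)
```

A second request for the same URL never re-generates the slug; it is served straight from the table.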

Version 2.01

  • Fully tested on DNN 6 and DNN 6 compliant.

Version 2.00

  • Adds tools for searching URLs and managing duplicate pages. Very useful for large sites.
  • Less reliance on the crawl for DNN pages. Friendly URLs for DNN pages now come directly from the database. Establishes "base" URLs, further reducing the potential for duplicate URLs.
  • System will automatically generate a sitemap index file for sites with more than 40,000 URLs.
  • Adds the ability to edit the header of the robots.txt file. PackFlash has established some baseline folders/files to include.
  • Better reporting - know how many URLs are causing problems, which ones they are, and how many new URLs were found during the last crawl.
  • Added a replacement character system - replace character "X" with character "Y" during friendly URL generation.
  • Additional improvements in memory management and CPU usage. The crawler puts less strain on the system.
  • Additional speed improvements through application optimization!
  • Module detects if there was a failure for ANY reason, even outside of the application. The application reports that there was a problem, provides as much information as possible about it, and picks back up where it left off to finish crawling the site. A fail-safe mechanism prevents the system from continually failing.
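
The replacement-character feature above is essentially a substitution pass over each URL segment. A hypothetical Python sketch; the replacement map shown is an assumption, since the actual characters are configured by the administrator.

```python
# Hypothetical replacement rules: swap character "X" for "Y" while
# generating the friendly URL. The real characters are configurable.
DEFAULT_RULES = {" ": "-", "&": "and", "'": ""}

def apply_replacements(segment, rules=DEFAULT_RULES):
    """Apply each configured X -> Y replacement to a URL segment."""
    for old, new in rules.items():
        segment = segment.replace(old, new)
    return segment
```

For example, `apply_replacements("Skins & Design")` yields `"Skins-and-Design"`.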

Version 1.04

  • Provides the ability to include the DNN page structure in the URL, with dashes generated between the words in each path segment.
  • Broken pages found during the crawl can now automatically be directed to a DNN page, providing a way to handle 404 errors until someone has time to correct the broken links.
  • Added a button to "reset" the system or change the extension by deleting all of the data in the tables and starting over. Typically used for testing purposes when just starting out.
  • Provides the ability to leave a portion of the language path in place while simplifying it for the friendly URL. When selected, this feature will remove the "/language/" part of the URL and leave the "en-us" portion, for instance.
  • MUCH improved user interface uses extensive AJAX/jQuery to give immediate feedback on changes within the dashboard when managing friendly URLs.
  • Provides the option to keep child portal paths in the URL if preferred.
  • Added a search to the "Manage Friendly URLs" page. Very useful feature!
  • Added the ability to "Remove URL" from the URL list. This places the URL in a group called "Manually Removed". This provides the ability to handle non-essential URLs found in the crawl.
  • Improved "Transform/Re-direct to another URL" function that provides more information and always has the tabid available as a reference.
  • Includes the ability to automatically group URLs of a particular page together that have the same search engine title, keywords, and descriptions to find duplicates. Any individual URL that gets grouped can be separated from the group by pressing the "make unique" button. This provides an automated way to group duplicates and still allow the ability to deal with exceptions.
  • XML sitemap and robots.txt files are now temporarily stored in a non-critical location to give the administrator time to review them before making them live. The existing sitemap and robots.txt file are not overwritten until requested.
  • Page performance improvements through database optimization. Pages load faster!
  • Crawl performance improvements through database optimization. Crawl runs faster! Typical results = 2500 URLs crawled in under 5 minutes. Many implementations are much faster.
  • Improved crawler function follows more redirects to find the final location - more accurate results from the crawl!
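
Taken together, the page-structure, dash, and language-path features above amount to building the friendly path from the page hierarchy. A rough, hypothetical Python sketch; the function names and exact slug rules are assumptions, not the module's implementation.

```python
import re

def slugify_segment(name):
    """Turn a DNN page name into a dashed, lowercase path segment."""
    words = re.findall(r"[A-Za-z0-9]+", name)
    return "-".join(w.lower() for w in words)

def build_path(page_ancestry, language=None):
    """Build a friendly path from the DNN page structure. When a
    language is given, keep only the simplified code (e.g. "en-us")
    instead of the full "/language/en-us" path."""
    parts = [slugify_segment(p) for p in page_ancestry]
    if language:
        parts.insert(0, language.lower())
    return "/" + "/".join(parts)
```

For example, `build_path(["About Us", "Our Team"], language="en-US")` yields `"/en-us/about-us/our-team"`.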
PackFlash| Superior DotNetNuke Solutions| 415 N. Lasalle Street, Suite 205, Chicago, IL 60654| success@packflash.com
Copyright 2019 by PackFlash