CheapWindowsHosting.com | Best and cheap Node.JS Hosting. Visual Studio 2015's installer includes an option to install Node.js as part of its regular setup in order to support the built-in Gulp and Grunt task runners. However, I ran into an issue today after updating Node.js outside of Visual Studio: because VS uses its own install that is separate from any external installation, you can hit a node_modules dependency problem where one version of npm installs a package (tying it to that version of Node/npm) and commands then break under the other version. Specifically, I had this issue with node-sass and its Windows bindings. The solution was to point Visual Studio at the version of Node.js that I had already set up outside of Visual Studio. Here's how to synchronize them:
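The usual approach in Visual Studio 2015 (described here as the commonly documented route, so treat the exact menu names as a guide rather than gospel) is: open Tools > Options > Projects and Solutions > External Web Tools, add the folder of your external Node.js install (for example C:\Program Files\nodejs) to the list of locations, and move that entry above the internal $(DevEnvDir)\Extensions\Microsoft\Web Tools\External entry so Visual Studio resolves node and npm from your system-wide install first. Restart Visual Studio afterwards; you may also need to delete and reinstall node_modules so native packages such as node-sass rebuild against the matching Node version.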
And that’s it! Now you’re all synced up! Having two separate installs is really confusing. If you’re starting out with JUST the VS Node.js version, you’ll eventually come to a point where you may update node.js by installing it outside VS, causing it to get out of sync anyway. If you’re a veteran Node.js person, then you’re already using Node outside VS and will need to do this anyway. It seems like there should be better documentation or indicators to show what version VS is using so this is more apparent.
Hope that helped. Did this fix it for you? Do you have a better way of keeping this in sync or a plugin/tool to help out? Let us know in the comments!
[su_button url="http://asphostportal.com/Nodejs-Hosting.aspx" style="3d" background="#ef362d" size="6" text_shadow="0px 0px 0px #ffffff"]Best OFFER Cheap Node.JS Hosting ! Click Here[/su_button]
CheapWindowsHosting.com | Best and cheap ASP.NET hosting. In this post we will look at a library for ASP.NET Core that adds support for criteria on global action filters. ASP.NET Core has the ability to register filters globally. That works great, but sometimes it would be nice to specify conditions for filter execution, and FluentFilters helps with this task.
Project on GitHub: https://github.com/banguit/fluentfilters
What should you do?
For an ASP.NET Core Web Application you should use FluentFilters version 0.3.* or higher. The latest version is currently 0.3.0-beta.
To install the latest package you can use the NuGet Package Manager in Visual Studio, or specify the dependency in the project.json file as shown below and run a package restore.
{
  //...
  "dependencies": {
    //...
    "fluentfilters": "0.3.0-beta"
  },
  //...
}
After installing the package into your ASP.NET Core Web Application, you should replace the default filter provider with the custom one from the library. Your Startup class should look like the following:
// Startup.cs
using FluentFilters;
using FluentFilters.Criteria;

namespace DotNetCoreWebApp
{
    public class Startup
    {
        //...
        public void ConfigureServices(IServiceCollection services)
        {
            //...
            services.AddMvc(option =>
            {
                option.Filters.Add(new AddHeaderAttribute("Hello", "World"), c =>
                {
                    // Example of using predefined FluentFilters criteria
                    c.Require(new ActionFilterCriteria("About"))
                        .Or(new ControllerFilterCriteria("Account"))
                        .And(new ActionFilterCriteria("Login"));
                });
            });

            // Replace default filter provider by custom from FluentFilters library
            Microsoft.Extensions.DependencyInjection.Extensions.ServiceCollectionExtensions.Replace(
                services, ServiceDescriptor.Singleton<IFilterProvider, FluentFilterFilterProvider>());
            //...
        }
        //...
    }
}
To register filters with criteria, you do it in the usual way but call the extension methods Add or AddService. Below you can see the signatures of these methods.
// Register filter by instance
void Add(this FilterCollection collection, IFilterMetadata filter, Action<IFilterCriteriaBuilder> criteria);

// Register filter by type
IFilterMetadata Add(this FilterCollection collection, Type filterType, Action<IFilterCriteriaBuilder> criteria);
IFilterMetadata Add(this FilterCollection collection, Type filterType, int order, Action<IFilterCriteriaBuilder> criteria);
IFilterMetadata AddService(this FilterCollection collection, Type filterType, Action<IFilterCriteriaBuilder> criteria);
IFilterMetadata AddService(this FilterCollection collection, Type filterType, int order, Action<IFilterCriteriaBuilder> criteria);
To specify the conditions, you set a chain of criteria for the filter at registration. Using criteria, you can control whether a filter executes or not. The library already provides three criteria out of the box: ActionFilterCriteria (match by action name), ControllerFilterCriteria (match by controller name) and AreaFilterCriteria (match by area name).
For one filter you can specify at most two chains of criteria: one chain of criteria that are required, and one chain that should be excluded.
option.Filters.Add(typeof(CheckAuthenticationAttribute), c =>
{
    // Execute if current area is "Blog"
    c.Require(new AreaFilterCriteria("Blog"));
    // But ignore if current controller is "Account"
    c.Exclude(new ControllerFilterCriteria("Account"));
});
Chains of criteria are constructed by using the methods And(IFilterCriteria criteria) and Or(IFilterCriteria criteria), which work as conditional logical operators && and ||.
option.Filters.Add(typeof(DisplayTopBannerFilterAttribute), c =>
{
    c.Require(new IsFreeAccountFilterCriteria())
        .Or(new AreaFilterCriteria("Blog"))
        .Or(new AreaFilterCriteria("Forum"))
        .And(new IsMemberFilterCriteria());

    c.Exclude(new AreaFilterCriteria("Administrator"))
        .Or(new ControllerFilterCriteria("Account"))
        .And(new ActionFilterCriteria("LogOn"));
});
In C#, the code above can be understood as the following pseudocode:
if( IsFreeAccountFilterCriteria() || area == "Blog" || (area == "Forum" && IsMemberFilterCriteria()) )
{
    if(area != "Administrator")
    {
        DisplayTopBannerFilter();
    }
    else if(controller != "Account" && action != "LogOn")
    {
        DisplayTopBannerFilter();
    }
}
To create a custom criterion, implement the FluentFilters.IFilterCriteria interface in your class and implement its single method, Match, with the logic that decides whether the filter should execute. As an example, look at the source code of ActionFilterCriteria:
public class ActionFilterCriteria : IFilterCriteria
{
    #region Fields

    private readonly string _actionName;

    #endregion

    #region Constructor

    /// <summary>
    /// Filter by specified action
    /// </summary>
    /// <param name="actionName">Name of the action</param>
    public ActionFilterCriteria(string actionName)
    {
        _actionName = actionName;
    }

    #endregion

    #region Implementation of IActionFilterCriteria

    public bool Match(FilterProviderContext context)
    {
        return string.Equals(_actionName, context.ActionContext.RouteData.GetRequiredString("action"), StringComparison.OrdinalIgnoreCase);
    }

    #endregion
}
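As a further illustration, here is a minimal sketch of a custom criterion; the class name and logic are hypothetical, and it assumes FilterProviderContext exposes the current ActionContext as in the example above:

public class IsAuthenticatedFilterCriteria : IFilterCriteria
{
    public bool Match(FilterProviderContext context)
    {
        // Run the filter only when the current request has an authenticated user
        var user = context.ActionContext.HttpContext.User;
        return user?.Identity?.IsAuthenticated == true;
    }
}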
CheapWindowsHosting.com | Best and Cheap windows Hosting. By default IIS and ASP.NET aren’t configured as part of a Windows setup (for obvious reasons) so developers are used to having to register IIS manually before being able to run and develop ASP.NET web sites on their desktops.
The old approach of registering ASP.NET with aspnet_regiis.exe no longer works on newer versions of Windows; a different command is required. Depending on what you already have enabled, this may work:
dism /online /enable-feature /featurename:IIS-ASPNET45
If you haven’t enabled anything related to IIS yet you can do that at the same time with:
dism /online /enable-feature /all /featurename:IIS-ASPNET45
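If you prefer PowerShell over dism, the DISM cmdlets should do the same thing (run this from an elevated PowerShell prompt):

Enable-WindowsOptionalFeature -Online -FeatureName IIS-ASPNET45 -All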
However, that might not appear to solve the problem even when it has. A post from Microsoft describes a known bug:
After the installation of the Microsoft .NET Framework 4.6, users may experience the following dialog box displayed in Microsoft Visual Studio when either creating new Web Site or Windows Azure project or when opening existing projects.
Configuring Web http://localhost:64886/ for ASP.NET 4.5 failed. You must manually configure this site for ASP.NET 4.5 in order for the site to run correctly. ASP.NET 4.0 has not been registered on the Web server. You need to manually configure your Web server for ASP.NET 4.0 in order for your site to run correctly.
NOTE: Microsoft .NET Framework 4.6 may also be referred to as Microsoft .NET Framework 4.5.3
This issue may impact the following Microsoft Visual Studio versions: Visual Studio 2013, Visual Studio 2012, Visual Studio 2010 SP1
Select “OK” when the dialog is presented. This dialog box is benign and there will be no impact to the project once the dialog box is cleared. This dialog will continue to be displayed when Web Site Project or Windows Azure Projects are created or opened until the fix has been installed on the machine.
Microsoft has published a fix for all impacted versions of Microsoft Visual Studio.
Visual Studio 2013
Visual Studio 2012
Visual Studio 2010 SP1
CheapWindowsHosting.com | Best and cheap cloud hosting server plan. In some respects cloud servers work in the same way as physical servers but the functions they provide can be very different. When opting for cloud hosting, clients are renting virtual server space rather than renting or purchasing physical servers. They are often paid for by the hour depending on the capacity required at any particular time.
Traditionally there are two main options for hosting: shared hosting and dedicated hosting. Shared hosting is the cheaper option whereby servers are shared between the hosting provider’s clients. One client’s website will be hosted on the same server as websites belonging to other clients. This has several disadvantages including the fact that the setup is inflexible and cannot cope with a large amount of traffic. Dedicated hosting is a much more advanced form of hosting, whereby clients purchase whole physical servers. This means that the entire server is dedicated to them with no other clients sharing it. In some instances the client may utilise multiple servers which are all dedicated to their use. Dedicated servers allow for full control over hosting. The downside is that the required capacity needs to be predicted, with enough resource and processing power to cope with expected traffic levels. If this is underestimated then it can lead to a lack of necessary resource during busy periods, while overestimating it will mean paying for unnecessary capacity.
Below is an overview of how cloud servers work and the key benefits they offer:
Cloud hosting services provide hosting for websites on virtual servers which pull their computing resource from extensive underlying networks of physical web servers. It follows the utility model of computing in that it is available as a service rather than a product and is therefore comparable with traditional utilities such as electricity and gas. Broadly speaking the client can tap into their service as much as they need, depending on the demands of their website, and they will only pay for what they use.
It exists as an alternative to hosting websites on single servers (either dedicated or shared servers) and can be considered as an extension of the concept of clustered hosting where websites are hosted on multiple servers. With cloud hosting however, the network of servers that are used is vast and often pulled from different data centres in different locations.
Practical examples of cloud hosting can fall under both the Infrastructure as a Service (IaaS) and Platform as a Service (PaaS) classifications. Under IaaS offerings the client is simply provided with the virtualised hardware resource on which they can install their own choice of software environment before building their web application. On a PaaS service, however, the client is also provided with this software environment, for example, as a solution stack (operating system, database support, web server software, and programming support), on which they can go straight to installing and developing their web application. Businesses with complex IT infrastructures and experienced IT professionals may wish to opt for the more customisable IaaS model but others may prefer the ease of a PaaS option.
A development of the concept of cloud hosting for enterprise customers is the Virtual Data Centre (VDC). This employs a virtualised network of servers in the cloud which can be used to host all of a business’s IT operations including its websites.
The more obvious examples of cloud hosting involve the use of public cloud models – that is hosting on virtual servers which pull resource from the same pool as other publicly available virtual servers and use the same public networks to transmit the data; data which is physically stored on the underlying shared servers which form the cloud resource. These public clouds will include some security measures to ensure that data is kept private and would suffice for most website installations. However, where security and privacy is more of a concern, businesses can turn towards cloud hosting in private clouds as an alternative – that is clouds which use ring-fenced resources (servers, networks etc), whether located on site or with the cloud provider.
In short, a typical cloud hosting offering delivers the features and benefits described above: resources that scale with demand, resilience from a large network of underlying servers, and utility-style pricing where you only pay for what you use.
HostForLIFE.eu is one of the best-known cloud hosting providers out there for starting a blog without spending a single extra penny. They have more than 2 million domains hosted on their servers. HostForLIFE.eu offers unlimited space and unlimited bandwidth, and claims 99.99% uptime with 24/7 technical support and a 30-day money-back guarantee.
CheapWindowsHosting.com | In this post I’ll show you how you can quickly and easily setup a fast Joomla! site running in Docker, and using Memcached for caching. We’ll also be using Docker Compose to make managing the relationships between containers easier. Docker Compose makes the task of managing multiple Docker containers much easier than doing them individually. It also makes it easier if you want to develop this site on your local environment and later push it to a remote server with docker — although that’s a topic for another tutorial.
If you haven’t installed Docker on your machine yet, follow these instructions. The instructions for Ubuntu 14.04 (LTS) are below. Be sure to run them as root! If you already have Docker skip on to installing Docker Compose if you don’t already have it.
Install Docker on Ubuntu 14.04 LTS

apt-get update
apt-get install apt-transport-https ca-certificates
apt-key adv --keyserver hkp://p80.pool.sks-keyservers.net:80 --recv-keys 58118E89F3A912897C070ADBF76221572C52609D
echo "deb https://apt.dockerproject.org/repo ubuntu-trusty main" >> /etc/apt/sources.list.d/docker.list
apt-get update
apt-get install linux-image-extra-$(uname -r) apparmor docker-engine
service docker start
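Optionally, you can confirm the Docker engine is working before moving on:

docker run hello-world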
Next install docker compose:
curl -L https://github.com/docker/compose/releases/download/1.6.2/docker-compose-`uname -s`-`uname -m` > /usr/local/bin/docker-compose
chmod +x /usr/local/bin/docker-compose
docker-compose -v
You should see docker-compose output its version information, something like: docker-compose version 1.6.2, build 4d72027.
Next up we need to create the docker compose files and get the Joomla files ready to use:
Create Project Directory
mkdir -p ./mysite/website_source ./mysite/mysql
cd mysite
Next create a Dockerfile (e.g. vim Dockerfile) that tells Docker to base your webserver code off of a PHP 5 image, and include your source code.
Dockerfile

FROM php:5-fpm

# Install dependencies
RUN apt-get update -y && apt-get install -y php5-json php5-xmlrpc libxml2-dev php5-common zlib1g-dev libmemcached-dev

# Install mysqli extension
RUN docker-php-ext-install mysqli

# Install memcache extension
RUN pecl install memcache && docker-php-ext-enable memcache

# Add sources to image
ADD ./website_source /site
And then create the docker-compose.yml file (e.g. vim docker-compose.yml). The docker-compose file lays out how your website is structured. It says that your site is composed of three services: web, which is a PHP image with your source code baked in; db, which is the MySQL database; and cache, which is the memcached image. It also tells Docker how to connect these containers so that they can communicate with each other, and to bind port 80 to the web container. Lastly, the mysql container will mount the mysql directory on the host and place the database files there, so if the container is ever removed you don't lose your database.
docker-compose.yml

version: '2'
services:
  web:
    build: .
    command: php -S 0.0.0.0:80 -t /site
    ports:
      - "80:80"
    depends_on:
      - db
      - cache
    volumes:
      - ./website_source:/site
  db:
    image: orchardup/mysql
    environment:
      MYSQL_DATABASE: joomla
      MYSQL_ROOT_PASSWORD: my_secret_pass
    volumes:
      - ./mysql:/var/lib/mysql
  cache:
    image: memcached
Be sure to replace my_secret_pass with a secure password for the MySQL root user!
Now that you have a Dockerfile and a docker-compose.yml, you just need to get the sources for Joomla and install them:
Download and Install Joomla!

wget https://github.com/joomla/joomla-cms/releases/download/3.4.8/Joomla_3.4.8-Stable-Full_Package.zip
unzip Joomla*.zip -d ./website_source
mv ./website_source/htaccess.txt ./website_source/.htaccess
mv ./website_source/robots.txt.dist ./website_source/robots.txt
Note that if you don't have unzip installed you can install it by running apt-get install unzip.
Now that you have everything set up, it's time to test everything by building and running the Docker containers. This is accomplished with docker-compose:
docker-compose build
docker-compose up
This will run the whole application in the foreground of your terminal. Go to http://localhost:80 and complete the Joomla installer! You'll use the MySQL username and password you specified in your docker-compose.yml file. The MySQL host is also specified in the docker-compose.yml file as the name of the database service; in our case, this is db. Once you're finished you can use CTRL+C to stop the containers.
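If you'd rather not keep the containers attached to your terminal while testing, you can also start them in the background (we'll set up a proper boot-time service below):

docker-compose up -d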
Now that your Joomla site is running under docker it’s time to connect it to the memcached server to make sure that things stay speedy!
To enable memcached, edit website_source/configuration.php and replace
public $caching = '0';
public $cache_handler = 'file';
with this
public $caching = '2';
public $cache_handler = 'memcache';
public $memcache_server_host = 'cache';
public $memcache_server_port = '11211';
Add your changes to the container image with docker-compose build and then run docker-compose up, log into the Joomla administration page and go to "Global Configuration" -> "System". You can tweak the settings under "Cache Settings" or leave them as they are.
The last step in setting up a web application with docker is to have the web server started when the server starts.
Create the file /etc/init/mydockersite.conf with the contents:
/etc/init/mydockersite.conf

description "Website Docker Compose"
author "MichaelBlouin"
start on filesystem and started docker
stop on runlevel [!2345]
respawn
script
  /usr/local/bin/docker-compose -f /var/www/mysite/docker-compose.yml up
end script
Be sure to replace /var/www/mysite/docker-compose.yml with the full path to your docker-compose.yml!
Save the file and run the following to register the service, and to start it:
initctl reload-configuration
service mydockersite start
And there you go! You can view logs for your service by running docker-compose logs while in the same directory as your docker-compose.yml, or by reading the logfile at /var/log/upstart/mydockersite.log.
CheapWindowsHosting.com | Best and cheap ASP.NET Core hosting. Today I upgraded my ASP.NET Core application from version 1.0 to version 1.0.1 because of a bug in Entity Framework. Right after updating to the latest ASP.NET Core version, I built the project but ended up with the following error in Visual Studio:
Can not find runtime target for framework '.NETFramework,Version=v4.5.1' compatible with one of the target runtimes: 'win10-x64, win81-x64, win8-x64, win7-x64'.
Possible causes:
The project has not been restored or restore failed - run dotnet restore
You may be trying to publish a library, which is not supported. Use dotnet pack to distribute libraries
After searching around for a few minutes I found issue #2442 on GitHub. The issue states that you need to update your project.json, and you have two options:
(1). Include the platforms you want to build for explicitly:
"runtimes": { "win10-x64": {}, "win8-x64": {} },
(2). Update the reference to Microsoft.NETCore.App to include the type as platform:
"Microsoft.NETCore.App": { "version": "1.0.1", "type": "platform" }
For more information on .NET Core application deployment you can read the docs. This is again another reason I love that all this work is being done out in the open. It really makes finding issues and bugs of this type easier. Hope that helps!
[su_button url="http://asphostportal.com/ASPNET-5-Hosting.aspx" style="3d" background="#ef362d" size="6" text_shadow="0px 0px 0px #ffffff"]Best OFFER Cheap ASP.NET 5 Hosting ! Click Here[/su_button]
CheapWindowsHosting.com | Best and cheap Let’s Encrypt hosting. An SSL certificate provides an encrypted connection between the server and the visitor’s browser, and is essential for an e-commerce site (such as one powered by WordPress + WooCommerce) in order to protect sensitive customer data and backend administration.
If you don’t have an e-commerce website, it can be difficult to justify the cost of an SSL certificate. However, as Google now rewards sites with an SSL with higher rankings, website owners are having to weigh up the cost against the potential benefits.
Still unsure if you should use SSL, even if it’s free? Here are three reasons to install a Let’s Encrypt SSL certificate: it encrypts the connection between your server and your visitors, protecting logins and customer data; Google rewards HTTPS sites with a rankings boost; and it costs nothing, with renewal handled automatically.
Let’s Encrypt is a free, automated, and open certificate authority based on the principles of co-operation, transparency and public benefit. It is sponsored by a number of very well-known companies, for example Google Chrome, Facebook and Automattic (the company behind WordPress).
Let’s Encrypt is provided with the WP Plus plan for all existing and future Create Hosting customers. Currently it can only be set up by an administrator. We can enable this during setup, or at any time – just submit a support ticket and we’ll take care of it for you.
[su_note note_color="#d38139" text_color="#ffffff"]
Good news: Plesk and Let’s Encrypt now make it possible for WordPress hosting customers on our WP Plus plan to use an SSL on their website completely free of charge. The Let’s Encrypt SSL needs to be renewed every 90 days, but Plesk takes care of this, automating the SSL re-issuance process every 30 days behind the scenes.
[/su_note]
CheapWindowsHosting.com | Best and cheap Git hosting. In this post I will explain how to integrate Plesk Onyx with Git.
Plesk allows you to integrate with Git – the most popular source code management system used by many web developers. You can manage Git repositories and automatically deploy web sites from such repositories to a target public directory. In other words, you can use Git as a transport for initial publishing and further updates.
[su_note note_color="#d38139" text_color="#ffffff"] Note: This functionality is not supported in Plesk installations running on Windows Server 2008. [/su_note]
In Plesk, you can add Git repositories of two types depending on the usage scenario: a local repository hosted directly in Plesk that you push to from your workstation, or a repository hosted with a remote Git service (such as GitHub or BitBucket) that Plesk pulls changes from.
When you have Git repositories enabled in your domain, the list of created repositories is displayed on the domain’s page. For each repository, the name, the current branch and the deployment path are displayed. The Deploy button near the repository name allows you to deploy the files from a repository (if manual deployment is configured) and the Pull Updates button allows you to pull the changes from the remote repository.
The Git link allows you to manage the domain’s Git repositories.
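As a rough sketch of the local-repository scenario, the commands on your workstation are just standard Git; the remote URL below is a placeholder for whatever Plesk shows for your domain’s repository:

git remote add plesk ssh://user@example.com/~/git/mysite.git
git add .
git commit -m "Update the site"
git push plesk master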
Founded in 2008, ASPHostPortal is a fast-growing web hosting company based in New York, NY, US, offering comprehensive Git hosting solutions, and they have a brilliant reputation in the Git development community for their budget- and developer-friendly hosting, which supports almost all the latest cutting-edge Microsoft technology. ASPHostPortal has various shared hosting plans ranging from Host Intro up to Host Seven, but there are four favourite plans: Host One, Host Two, Host Three, and Host Four. Host One starts at $5.00/month, Host Two at $9.00/month, Host Three (the most popular plan) at $14.00/month, and Host Four at $23.00/month. All of their hosting plans allow users to host unlimited domains and unlimited email accounts, with at least 1 MSSQL and 1 MySQL database. ASPHostPortal is the best Git hosting; check further information at http://www.asphostportal.com
CheapWindowsHosting.com | Best and cheap Magento Hosting. In this post I will explain more about Magento. As a digital agency, we work with Magento every day on both development and search engine optimisation projects. If you’ve used Magento in the past, you know it’s a huge system with lots of menus, drop-down options and settings all over the place.
Optimising Magento for search is quite straightforward once you know how to do it, so I put together an easy-to-follow guide that everyone can use to make the process easier to learn. This guide is based around the Magento Community Edition.
For this tutorial, I’m going to assume you have a basic knowledge of SEO, but I’ll also point out selections along the way if you want to read more about specific aspects of SEO that I refer to.
I’m not going to cover general page layout, heading tags or the actual content you should write. This is a basic article to get the core configuration of Magento correctly setup, and to help people out with some of the most common questions we’re asked with regards to SEO and Magento.
So, log in to your Magento store’s admin panel and let’s get stuck in.
Most stores that are live will have already carried out a few of these steps. That’s okay, since we are covering the basics, and want to make sure we cover all the bases. Let’s start from the top of the System > Configuration page and work our way down.
Go to System > Configuration > Design > HTML Head. In here, you’ll see the basic fallback settings that Magento has that you can use for SEO purposes. If you haven’t already set up a favicon, then do that first. It doesn’t affect your SEO, but the standard Magento one doesn’t look great.
The defaults you want to ensure are set here are Default Title, Default Description, and Default Robots.
Recommended: We usually fill in the Title Suffix as well with our clients’ brand name. For example, we might put – Pinpoint Designs into the Title Suffix field. This will then be appended to each title tag.
Since the above options are only fallbacks, I would normally recommend putting your company name in as the Default Title, and using a description of your company for the Default Description. It’s very important that your Default Robots is set to INDEX, FOLLOW if your store is live. For a development store, you should switch this to NOINDEX, NOFOLLOW. (Remember to swap it back when you go live, or search engines may choose to ignore your website.)
Note: While Meta Keywords are not used by many search engines anymore, Magento will fall back to your product names if these aren’t set. For Default Keywords, you can enter your store name as the fallback.
If you’re looking for advice on Meta Titles and Meta Descriptions, take a look at the Moz guides that I’ve linked to here.
Moving on, one of the easiest changes you can make to Magento is to prevent the index.php string from appearing in your main URL. At the same time you change this, you can also force Magento to the www. or non-www. version of your website to avoid duplicates.
To carry out these changes, go to System > Configuration > Web. In here, you’ll see a list of different sections that you can open. We want to open both the URL Options and Search Engine Optimisation sections.
Now set Auto-Redirect to Base URL to Yes (301 Moved Permanently) to automatically get Magento to redirect to your base URL. (So if your base url is http://www.yourdomain.com, it will redirect to the www. version of your website from now on.)
Next, set Use Web Server Rewrites to Yes in order to remove the index.php string from your base URL.
Note: The above changes may not work depending on your server configuration. If in doubt, contact your web hosting provider for assistance.
In order to get the search engines to only recognise one version, we should enable canonical URLs. To do this, go to System > Configuration > Catalog and choose the Search Engine Optimizations dropdown option. There are quite a few options that we can set in here. I’ll explain them very quickly:
Once you’ve updated these settings, it’s important to reindex the data on your website. To do this, go to System > Index Management. Click Select All and then Reindex Data using the mass action drop down in the top right hand corner of the page.
The easiest way for a search engine to crawl your website is via a sitemap submitted to Google Webmaster Tools, Bing Webmaster tools, Yahoo Site Explorer, etc. As you would expect, Magento will keep your sitemap up to date and generate this for you automatically. In order to enable this, go to System > Configuration > Google Sitemap (under the Catalog heading).
In here, we can configure the priority of each of our pages, along with how often they’re updated and how often we want the sitemap to be updated. This section is a little hard to explain in a tutorial, as it completely depends on your type of store and what you’re primarily optimising.
For the purpose of this article, we’re going to assume your category pages are the most important pages, as these house all of your products and should be optimised for more general terms. We’d next prioritise product pages, as these are specific pages that you want people to hit if they’re looking for a particular item. Finally, we’d have our CMS pages. These are pages that cover information such as terms and conditions, your privacy policy, and shipping information, so they’re generally lower priority. Your homepage also comes under the CMS pages heading.
So, using the above as an example, we’d select the priority and frequency as follows:
Category Options: Frequency set to Daily; Priority set to 1.
Product Options: Frequency set to Daily; Priority set to 0.5.
CMS Page Options: Frequency set to Weekly; Priority set to 0.25.
With the above, if your product catalog and categories don’t change very often, you could drop the frequency down to weekly, but this isn’t necessary.
Note: For the Generation Settings to work, you will need to make sure your Magento cron works correctly.
Next, we need to generate the actual sitemap file. To do this, go to Catalog > Google Sitemap and click the Add Sitemap button in the top right. Then give your sitemap a name, and put a forward slash in the Path field to get it to save in the root directory.
Once done, click Save & Generate and your sitemap should be viewable at yourdomain.com/sitemap.xml.
Assuming it all worked correctly, head over to Google, Bing and Yahoo and submit the sitemap URL you’ve just generated. We’ll add it to the Robots.txt file later.
Additional Notes: If you’re running multiple stores from the same Magento installation, you might want to separate your sitemaps. So using the example of an English and Spanish store, you might call one sitemap-en.xml and the other sitemap-es.xml. You might also want to put these into a subdirectory. If you do this, you will need to make sure that the folder has CHMOD permissions to write. CHMOD 755 should be fine, but you may need to change this to 775 on certain setups. Never set your CHMOD permissions to 777. If in doubt, ask your hosting provider.
I’m not going to go into huge detail on the Robots.txt file as there’s a fantastic guide written by Inchoo with example templates and different versions explained. Take a look at it and make a judgement call on which Robots.txt file will do the best job for you. You can then modify it to suit your store’s particular requirements.
Remember to update the sitemap URL with the one we just generated (above). This will allow other search engines to pick up your sitemap without the need to submit to them all.
Of the templates in the above guide, I would strongly recommend using the Inchoo robots.txt file. That said, it’s important to check everything over before you add it to your store.
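Purely as a sketch (the Inchoo template is far more complete, and the paths below are typical examples rather than a definitive list), a Magento robots.txt generally blocks non-content URLs and points crawlers at your sitemap:

User-agent: *
Disallow: /checkout/
Disallow: /customer/
Disallow: /catalogsearch/
Disallow: /app/
Disallow: /var/
Sitemap: http://www.yourdomain.com/sitemap.xml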
Adding your Google Analytics tracking code to Magento is very straightforward. Head over to http://analytics.google.com and log into your account. Make sure that you have eCommerce tracking turned on. (This can be done by going to Admin and clicking the Ecommerce Settings option, which appears under the View heading on the right.)
Once you’ve done this, head over to System > Configuration > Google API to enable the module and check your UA- Tracking Number. Click Save and you’re done.
Alternative Solution – I would recommend installing the Fooman Google Analytics + module, which is free from the Magento extensions store. This allows you to track AdWords conversions, secondary profiles, dynamic remarketing and more within Magento. If you’re unsure of how to install modules, ask your web developers, or follow this guide. Once installed, go to System > Configuration > Google API and open up the option for GoogleAnalyticsPlus by Fooman. Fooman offers a full guide on how to set this module up, and it’s much better than the standard Magento tracking.
Finally, let’s take a look at page optimisation. This is a fairly simple section of Magento where it’s really down to you to come up with some brilliant content and make sure your pages are optimised properly for the search engines. We’ll split this into three sections: CMS Pages, Category Pages, and Product Pages.
Key Things to Remember About All the Above Pages
I hope this article has been helpful. Depending on the response, I may do a follow up article that explains the more advanced sections of Magento.
Magento is a very powerful system that is easily scalable, and I work with our clients at Pinpoint Designs worldwide to build and promote their stores with it. So if you have any questions regarding Magento, post a comment below and I’ll respond as soon as possible.
CheapWindowsHosting.com | Best and cheap Docker hosting. In this post we will explain everything about Docker.
Docker is a tool designed to make it easier to create, deploy, and run applications by using containers. Containers allow a developer to package up an application with all of the parts it needs, such as libraries and other dependencies, and ship it all out as one package. By doing so, thanks to the container, the developer can rest assured that the application will run on any other Linux machine regardless of any customized settings that machine might have that could differ from the machine used for writing and testing the code.
In a way, Docker is a bit like a virtual machine. But unlike a virtual machine, rather than creating a whole virtual operating system, Docker allows applications to use the same Linux kernel as the system that they’re running on and only requires applications be shipped with things not already running on the host computer. This gives a significant performance boost and reduces the size of the application.
And importantly, Docker is open source. This means that anyone can contribute to Docker and extend it to meet their own needs if they need additional features that aren’t available out of the box.
Docker is a tool that is designed to benefit both developers and system administrators, making it a part of many DevOps (developers + operations) toolchains. For developers, it means that they can focus on writing code without worrying about the system that it will ultimately be running on. It also allows them to get a head start by using one of thousands of programs already designed to run in a Docker container as a part of their application. For operations staff, Docker gives flexibility and potentially reduces the number of systems needed because of its small footprint and lower overhead.
Docker brings security to applications running in a shared environment, but containers by themselves are not an alternative to taking proper security measures.
Dan Walsh, a computer security leader best known for his work on SELinux, gives his perspective on the importance of making sure Docker containers are secure. He also provides a detailed breakdown of security features currently within Docker, and how they function.
A number of companies and organizations are coming together to bring Docker to desktop applications, a feat that could have wide-ranging impacts on end-users. Microsoft is even jumping on board by bringing Docker to their Azure platform, a development that could potentially make integration of Linux applications with Microsoft products easier than ever before.
Docker 1.0 was released on June 9th, during the first day of Dockercon, and it is considered the first release of Docker stable enough for enterprise use. Along with this launch, a new partnership was announced between Docker and the companies behind libcontainer, creating a unified effort toward making libcontainers the default standard for Linux-based containers. The growth of Docker and Linux containers shows no sign of slowing, and with new businesses jumping on the bandwagon on a regular basis, I expect to see a wealth of new developments over the coming year.