27 February, 2013

What is Hosting?

Hosting is like renting an office. You can do it yourself out of your garage, or you can pay for better facilities with all the essential plumbing and security features. For an office this would mean renting at a business center. For your website this means paying for space on a highly specialized computer server within a dedicated hosting facility. 

Just like an office directory listing, your website has a physical address that maps to the exact location of the website on a hosted server. These addresses have their own 411 or yellow pages directory, called the Domain Name System (DNS).

The DNS directory acts like a GPS (Global Positioning System) for your website, mapping the route between your site's URL (www address) and the hosted server's location, identified by its Internet Protocol (IP) address. The IP address is a unique number that identifies the physical location of a computer, much as a postal code identifies a physical office location.
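This name-to-address lookup is something you can try yourself. As a minimal sketch using Python's standard library (the hostname here is a placeholder — a real check would use your own site's domain), the following asks the system resolver for the IP address behind a hostname, the same step a browser performs before connecting to a hosted server:

```python
import socket

# Ask the system resolver (which consults DNS or the local hosts file)
# for the IP address behind a hostname -- the same lookup a browser
# performs before connecting to a web server.
hostname = "localhost"  # placeholder; substitute your site's domain
ip_address = socket.gethostbyname(hostname)

print(f"{hostname} resolves to {ip_address}")
```

The returned string is the server's IP address, the "postal code" that DNS keeps on file for the domain name.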

Types of Hosting


The high configuration computers that host your files are called web servers. In hosting facilities, these are connected to high-speed fiber optics with near 100% uptime. From your home or garage you may get a lower quality of connection than that typically available from companies dedicated to hosting.
Shared Hosting (Virtual Hosting): Because high-configuration servers are expensive to set up, it is common to share their space across many websites. Each website has its own web address and domain name. This economical solution suits businesses and individuals who want an online presence with minimal involvement and budget.
Some providers offer free web space, typically with basic features for personal web pages and mostly supported by advertisements. There is, however, no such thing as a truly 'free' site.
Reseller Hosting: Some service providers partner with professional web companies, which become web hosts themselves by reselling the web space and providing their own technical support.
Dedicated Hosting: This is required for large sites or applications, which run within a dedicated space. Sites that need a higher level of security due to the nature of the information being accessed or supplied can take the extra step of having their site hosted on a Virtual Dedicated or Private Server (VPS) with additional security features.
Co-location Hosting: If you have your own server, ready with its applications, and just require a continuous electrical supply and a high-speed internet connection, you can locate it with an ISP or professional web hosting company that has the facilities.
 
 
There are a number of other hosting terms, like cloud hosting, clustered hosting, and grid hosting, providing web space from as little as 1MB up to gigabytes. There is a whole industry dedicated to housing websites and web applications, and just as in construction, getting the right design and connections for your location and needs is the most important part.
Support of Hosting
Along with each of these hosting services, a good provider typically supplies 24-hour support, daily backup of data, options for high web traffic volume, powerful bandwidth, e-mail capabilities, and specific database and applications access.
Treefrog Interactive, a local Toronto, Ontario hosting provider, supplies a high-quality service that meets your needs. Contact us about hosting packages to suit your requirements.

23 February, 2013

What is new in Cloud Computing?

In this post, we’ll look at some of the trends in cloud computing and how cloud providers are reacting to address the needs of their customers.
 
Cloud providers catering to niche audiences:
Rather than focusing on providing a generic cloud that is application-agnostic, or an application that is customer-agnostic, cloud providers are starting to address the needs of specific customers. They are realizing that generic clouds are not sufficient; a niche focus seeds quicker adoption. Providing customers generic CRM or sales force automation software is not enough. After all, these are complex applications that customers use and wield as a competitive advantage, so it doesn't make sense to adopt a "one size fits all" approach for every customer. Customers need the flexibility to extensively customize their applications and environments. Larger providers like Google and Amazon (AWS) are already addressing this need.

Proliferation of Private Clouds

Adoption of private and hybrid clouds is rising steeply, since these address consumers' security concerns about public clouds. Larger organizations will typically use public clouds like AWS (Amazon Web Services) and Rackspace only for non-mission-critical applications or as test-bench environments, preferring private clouds for their mission-critical applications and sensitive data. Smaller organizations will continue to use these cloud providers for their mission-critical systems, for obvious cost reasons.
Rapid adoption of the Cloud by SMBs
Small and Medium Businesses will rapidly move to the cloud as awareness and education by providers and early adopters increase. SMBs are quickly realizing that the cloud offers them the ability to scale up operations much faster than if they tried to manage their IT needs in-house. Cloud providers are offering SMBs access to applications for everything from mail, office productivity, and sales force automation to more specific applications like accounting, sales, and even business intelligence.
Disaster Aftermath – the Need for BCP 
After the tsunami in Japan, awareness of the need for BCP (Business Continuity Planning) among SMBs has increased. Something that was not a common practice among SMBs is now fast becoming the norm. As SMBs take a second look at BCP, cloud solutions and cloud providers are becoming a cornerstone of their BCP strategies. It is easy for SMBs to understand that having their mission-critical systems on a public cloud infrastructure means that, in the event of a disaster, their data is safe, their systems are up, and it's business as usual for their customers. Interruptions are far less severe, and a business can be back up and serving customers almost as soon as employees can regroup and access an internet connection.
Increase in the number of Large Public and Hybrid Cloud Providers
Call it economies of scale, new revenue opportunities, or simply a new line of business: large organizations are looking to leverage their resources and expertise by building large cloud infrastructures for their own use as well as acting as providers for other large, medium, and small businesses. The likes of Amazon (AWS), Salesforce.com (Force.com), Google, Microsoft, and IBM are some of the top providers of cloud services today; there is no reason why other companies with massive in-house information processing needs would not extend their platforms, services, and know-how as cloud services to other organizations. We will see more of these in the coming months and years.
Addressing Data Location
One of the main concerns customers have about cloud computing is that they have little knowledge of where their data will actually physically reside once they put it on the cloud. It may come as a surprise that your data may not reside in the same city, state, or even country as your organization. While the cloud provider may be contractually obliged to ensure the privacy of your data, it may be even more obliged to abide by the laws of the state and/or country in which your data resides, so your organization's rights may be marginalized. Cloud providers are very cognizant of this fear among some of their potential customers; to allay it, they are giving customers the option to choose where their data is geographically located. Larger providers like IBM already offer services that highlight the fact that the data will be localized. This trend will only continue in the months ahead.

How to get rid of pimples naturally, without any medicine?

Pimples are very common among young boys and girls due to hormonal changes. Pimples can leave black spots on the face that look unattractive, so many people look for natural ways to cure them. Everybody wants to get rid of pimples in order to look good, and pimples can make us hesitant when talking with other people.

Here are some natural ways to get rid of pimples.


(1) Keep your face clean at all times by removing all dirt. Wash your face frequently with a mild, non-drying soap that contains skin-strengthening herbs and other natural ingredients. Remove your makeup before going to bed.

(2) Rubbing your pimples to remove them will spread the infection across your face, so do not rub, pinch, or squeeze them.

(3) Always drink lots of water.

(4) Avoid overly oily and spicy food, and keep yourself away from junk food.

(5) Make a habit of applying a mix of rose water and lemon juice to your face, particularly on the pimples.

(6) Mint leaves and turmeric powder have remedial qualities that help remove pimples.

(7) Cucumber is also one of the best remedies for pimples.

(8) Applying honey to the pimples for some time can also help cure them.

(9) Meditation and yoga can help you get rid of pimples, as they help restore hormonal, mental, physical, and emotional balance.

(10) Maintain a healthy diet and avoid tea, coffee, meat, ice cream, cream, etc.

Gene sequencing yields breakthrough for children with rare Parkinson’s-like disorder

Doctors can now use a person’s genetic sequence as the basis for rational drug selection—a sign of how far personalized genomics has come in recent years. A case report published today in the New England Journal of Medicine illustrates the strength of this approach.
The paper describes an extended Saudi Arabian family in which many young siblings suffered from a Parkinson's-like condition affecting their movement. The children had normal levels of the neurotransmitters dopamine and serotonin in their spinal fluid, suggesting they should have been healthy. The unique circumstances prompted researchers to use the latest advances in genomic sequencing to identify a mutation in the SLC18A2 gene, which encodes the protein vesicular monoamine transporter 2, or VMAT2, as the cause of the disease.
A team led by Berge Minassian, a neurologist at the Hospital for Sick Children in Toronto, successfully pinpointed the mutation and treated the symptoms in these siblings. “I am certain that in the next few years patients walking into children’s hospitals will have their whole genomes sequenced,” says Minassian. Until now, magnetic resonance imaging (MRI) has been the primary diagnostic tool for people with neurological diseases.
The study’s initial patient was a 16-year-old girl first diagnosed with muscle weakness when she was just four months old. She sat for the first time when she was two and a half years old, began crawling at four and walking—and only with difficulty—at the very late age of 13. Her symptoms resembled Parkinson’s disease, but all her metabolic and MRI tests came back normal. Doctors also ran tests on her 2-year-old sister who suffered from similar symptoms and a red flag showed up in the toddler’s urine, where dopamine levels were below average. The physicians then gave the 16-year-old and her three younger siblings levodopa-carbidopa, a dopamine precursor used to treat Parkinson’s. They were puzzled, though, when the conditions worsened in all four.

Injectable gel repairs damage after heart attack in pigs

As you read this sentence, on average at least one person in the US will have started to clutch her chest. The blood flow to her heart will become blocked, and cardiac muscle cells will start to die off and be replaced with scar tissue. This person has just suffered a heart attack and most likely will go on to develop heart failure, a weakening of the heart’s ability to pump blood and oxygen. In five years’ time, there’s a 50/50 chance she’ll be dead.
                         
There are currently no treatments that can repair the damage associated with this so-called ‘myocardial infarction’ (MI), but a potential solution is now showing promise in a large-animal model. Reporting today in Science Translational Medicine, a team of bioengineers at the University of California–San Diego (UCSD) has developed a protein-rich gel that appears to help repair cardiac muscle in a pig model of MI.
The researchers delivered the hydrogel via a catheter directly into the damaged regions of the porcine heart, and showed that the product promoted cellular regeneration and improved cardiac function after a heart attack. Compared to placebo-treated animals, the pigs that received a hydrogel injection displayed a 30% increase in heart volume, a 20% improvement in heart wall movement and a 10% reduction in the amount of scar tissue three months out from their heart attacks. “We hope this will be a game-changing technology that can actually prevent heart failure after heart attack,” says UCSD’s Karen Christman, who led the study.
Christman and her team developed their hydrogel by stripping muscle cells from pig hearts, leaving behind a network of proteins that naturally self-assembles into a porous and fibrous scaffold upon injection into heart tissue. They previously tested its safety and efficacy in rats, where they found increased cardiac function and no toxicity or cross-species reactivity.
Similar strategies using naturally derived scaffolding, such as small intestinal submucosa from pigs in wound patching, are well established. The UCSD study now shows the clinical potential of this approach for cardiac regeneration after a heart attack in a large animal that more closely approximates humans. Christman has already formed a company based on the technology, called Ventrix, and she hopes to move the product into human safety trials within the year.
Jeffrey Karp, a bioengineer at the Brigham and Women’s Hospital in Boston who is working on a glue that can bind cardiac tissue in live rat and pig hearts (as reported in a news feature this month in Nature Medicine), believes this is promising technology. “Promoting regeneration following myocardial infarction is one of the holy grails in medicine,” he says.
But, Karp warns, “it will be important to validate these results in additional pre-clinical studies, and compare efficacy with other approaches prior to marching onward to the clinic.”