Originally posted by winofiend
I have no idea what that means, lol, I was never any good at grasping the networking side of things.
If I plug a blue cable in and I get a green light, I'm happy.
So... what does it really mean? Is it just a software-based routing system? Because all my limited knowledge remembers is back in '99, using software-based routers set up in Linux.
I'm guessing it's nothing like that. But that's all I can fathom - a more efficient way to manage traffic in a dynamic manner.
I dunno, if it means that I can play online games where the server is in a different country and not have 400ms pings, then I'm all for it. But as far as I know that's a limitation of the sea link from Australia to the rest of the world.
I used to tracert to, say, google.com, and every hop would be less than 30ms until we hit the Telstra link out, then greater than 350ms... so I dunno how that will affect us, if I'm even close to understanding the little bits I think I might know.
Originally posted by kwakakev
I did not notice, but there have been some internet traffic problems the past couple of weeks. So as a website administrator, what does this mean to me?
Are there any specific languages or software development approaches that I need to be aware of to make the most of this new development?
How am I supposed to structure database development and administration to work in a multiple-server environment?
Are there any particular standards that I need to be made aware of as a developer, or is this purely the web host's responsibility?
Do I just make my single MySQL database and all the virtualization is done under the hood? Are there any specific software versions I need to be careful of?
Originally posted by Maxatoria
Once my sinuses decide to play ball and not go boom boom, I'll have a proper look at what they're doing.
For the average person there will be no change, as the data presented to the application will be the same. But I'd imagine that by taking control away from static routing and older networking protocols, and being able to adjust more on the fly, they can take advantage of better ways of doing things that are not implemented in classical routers/switches.
Originally posted by CarbonBase
I see. Someone did a software upgrade and there was no loss of service. No one noticed? Could that be like migrating 750,000 people who are talking on a telephone, from one telephone switch to another while they were on a call, and nothing happened. No dropped calls. Piece of cake. Yeah, technology. It's like that.
Well, in the case of two or more servers, one DB and one DC, you would set up a server to be the control-plane "controller" with OpenFlow running on it, and OpenFlow would "manage" the network communications over the network and to your users.
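To make that concrete, here is a minimal sketch of such a control-plane app, written against the Ryu OpenFlow framework in Python (the choice of Ryu is my assumption; the post doesn't name a controller). The switch forwards packets it has no rule for up to this app, which tells it what to do, in this case simply flooding them out every port. Run it with ryu-manager.

# Minimal OpenFlow controller sketch (assumes the Ryu framework).
from ryu.base import app_manager
from ryu.controller import ofp_event
from ryu.controller.handler import MAIN_DISPATCHER, set_ev_cls
from ryu.ofproto import ofproto_v1_3

class SimpleHub(app_manager.RyuApp):
    OFP_VERSIONS = [ofproto_v1_3.OFP_VERSION]

    @set_ev_cls(ofp_event.EventOFPPacketIn, MAIN_DISPATCHER)
    def packet_in_handler(self, ev):
        # The switch sends packets it has no flow entry for up to
        # this control-plane application.
        msg = ev.msg
        dp = msg.datapath
        ofp = dp.ofproto
        parser = dp.ofproto_parser
        # Simplest possible policy: flood the packet out every port.
        actions = [parser.OFPActionOutput(ofp.OFPP_FLOOD)]
        data = msg.data if msg.buffer_id == ofp.OFP_NO_BUFFER else None
        out = parser.OFPPacketOut(datapath=dp, buffer_id=msg.buffer_id,
                                  in_port=msg.match['in_port'],
                                  actions=actions, data=data)
        dp.send_msg(out)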
Originally posted by Maxatoria
reply to post by XPLodER
Why should anyone notice what happens between Google's data centres? It's their internal traffic, and if they had even half a brain cell they'd still have the older gear up and running, plus alternative routes between DCs, so should any of the upgrades fail, everything would be business as usual. At worst people may notice an extra 0.5 seconds before their videos start to play, but I would have thought that each link would be brought up and left routing special internal load-test data for a few weeks before the day of switch-over, to ensure nothing goes wrong.
Originally posted by kwakakev
reply to post by XPLodER
As an emerging website developer there are some trends to be careful of. I am aware of some Google services that provide the same scalability as Google in general, but they come attached with privacy implications. In general I like the philosophy of 'if you cannot be honest about it you should not do it', but I am also aware that our reality is a bit more complicated than that.
The DDoS threat has made me quite wary and tired as a developer.
Having one abstracted point of access as a developer does alleviate a lot of strain with database integration.
I very much like the PHP platform; as a server-based scripting approach it is more secure than the locally-run JavaScript approaches.
Going through the database channel of stored procedures is the most secure, as only variables, and not code, are transmitted. There is also potential for optimization, with only predefined and known procedures allowed. It does appear that some corporate developers are having major issues with feature creep in this realm.
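The post is written with PHP in mind, but the stored-procedure idea is language-agnostic. Here is a minimal Python sketch using MySQL Connector/Python (the procedure name get_user, its one-parameter signature, and the connection details are all hypothetical) showing why only values, never SQL text, cross the wire.

# Sketch: calling a predefined stored procedure so that only the
# procedure name and parameter values are transmitted.
import mysql.connector

conn = mysql.connector.connect(host="localhost", user="app",
                               password="secret", database="site")
cur = conn.cursor()

user_supplied_id = 42  # e.g. taken from a request parameter

# callproc() passes parameters out-of-band; no SQL string is built
# from user input, so the value cannot inject code.
cur.callproc("get_user", (user_supplied_id,))
for result in cur.stored_results():
    print(result.fetchall())

cur.close()
conn.close()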
Where I am currently stuck as a developer is how to secure a session. Personally I would like to build a strong undercurrent before attaching an SSL layer: even if SSL is cracked, there is a whole new layer of coding for the attackers to contend with. At the moment, PHP and SQL are what I trust as an individual, group and whole.
Originally posted by Opportunia
reply to post by XPLodER
This was not done by someone simply making an innovative decision. The reason they did it was to avoid paying Vringo usage rights after being sued by them for patent violations.
While there are aspects I can't talk about, you can ask me direct questions if you want direct answers.
With an SDN telco network (ISP), the packets can be "dropped" at the network edge, at the telco level, or at the individual network level.
It can be pro-actively programmed for fail-over or load balancing as well.
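As a sketch of what "dropping at the edge" can look like in practice, here is a flow rule pushed from the same assumed Ryu controller as above (the source address 203.0.113.9 is a documentation-range placeholder, and block_source is a hypothetical helper).

# Sketch: install a drop rule on an edge switch over OpenFlow 1.3.
def block_source(datapath, src_ip="203.0.113.9"):
    parser = datapath.ofproto_parser
    # Match IPv4 traffic (eth_type 0x0800) from the offending source.
    match = parser.OFPMatch(eth_type=0x0800, ipv4_src=src_ip)
    # A flow entry with an empty instruction list drops the packets;
    # a high priority makes it win over normal forwarding entries.
    mod = parser.OFPFlowMod(datapath=datapath, priority=100,
                            match=match, instructions=[])
    datapath.send_msg(mod)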
That is interesting: using stored procedures to provide secure commands?
Originally posted by XPLodER
There are protocols, like the loss-avoidance algorithms, that can speed up delivery of packet data over this transport to make gaming a lag-free experience.
You may get a lag-free (as in smooth) playing experience, but it will add an offset to everything.
Originally posted by XPLodER
Originally posted by kwakakev
I did not notice, but there have been some internet traffic problems the past couple of weeks. So as a website administrator, what does this mean to me?
Your hosting provider will, in their next upgrade cycle, be made aware of SDN, or software-defined networking.
You as an administrator may be interested in the extra functionality: the load-balancing nature of SDN lets you provide better service and mitigate DDoS attacks, and from a bandwidth point of view you can save by having a dynamic allocation of bandwidth. The biggest change for an administrator is that you don't have to manually configure your switches, routers, etc. In theory you should be able to serve more users over the same network/server infrastructure. A sketch of the load-balancing idea follows.
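One way that load balancing is commonly expressed in OpenFlow 1.3 is a "select" group, which hashes each flow onto one of several buckets. A minimal sketch, again against the assumed Ryu framework (the port numbers and the install_balancer helper are hypothetical):

# Sketch: spread traffic across two ports with an OpenFlow
# "select" group; the switch picks one bucket per flow.
def install_balancer(datapath, group_id=1, ports=(1, 2)):
    ofp = datapath.ofproto
    parser = datapath.ofproto_parser
    # One bucket per backend port, with equal weights.
    buckets = [parser.OFPBucket(weight=50, watch_port=ofp.OFPP_ANY,
                                watch_group=ofp.OFPG_ANY,
                                actions=[parser.OFPActionOutput(p)])
               for p in ports]
    req = parser.OFPGroupMod(datapath, ofp.OFPGC_ADD,
                             ofp.OFPGT_SELECT, group_id, buckets)
    datapath.send_msg(req)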