On 16 December 2011 18:30, Darrell Anderson <humanreadable(a)yahoo.com> wrote:
>> Wrong wrong wrong. Cookies are great when used to store information I
>> wanted stored, like preferences or to stay logged into a website;
>> tracking cookies are obviously bad, however.
> Let me rephrase. When a web site refuses to load because cookies are not
> enabled, then the web site designer is playing the role of a fool. I came
> across such a site today (Sears) while researching a product. Because I
> use a white-list approach for cookies, I do not have cookies enabled
> globally. All I received from the web site was a message that I did not
> have cookies enabled. No product information. I closed the page and moved
> on. I'm not dealing with idiots like that.
>
> Cookies are fine when the END-USER finds them useful. When cookies are
> required merely to use a web site (like Etherpad), then there is something
> inherently wrong with the design.
>
> The same goes for JavaScript. For example, visit the Asus downloads site:
> no meaningful function at all unless JavaScript is enabled. The Amazon
> web site is another example, in that viewing alternate images is not
> possible without JavaScript. That's not good design.
Amazon.com caters to the 99.999999999999% of users who have JavaScript
enabled. Frankly, as a web developer, I'd be insane to assume everyone had
it turned off.
>> JavaScript is not an issue for bandwidth; it is interpreted locally, so
>> it won't present a problem. The only functionality I am suggesting
>> should not exist is a commenting system - which is non-essential.
> JavaScript IS a bandwidth issue. Try surfing the web all day on dial-up.
> Although today I have a broadband connection, the connection is not
> high-end or robust. I still very much experience the overhead of all the
> Web 2.0 nonsense.
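As an aside, the transfer-time gap Darrell is describing is easy to put in rough numbers. This is my own back-of-the-envelope sketch, not figures from the thread; the payload size and line rates are assumptions for illustration only.

```python
# Back-of-the-envelope numbers (illustrative assumptions, not measurements):
# how long a pile of script files takes to download on dial-up vs. a modest
# broadband line. A 56 kbps modem moves roughly 56,000 bits/s before
# protocol overhead, so real-world times are even worse.

def download_seconds(size_kb: float, link_kbps: float) -> float:
    """Seconds to transfer size_kb kilobytes over a link_kbps line."""
    bits = size_kb * 1024 * 8          # payload size in bits
    return bits / (link_kbps * 1000)   # line rate in bits per second

# Assume a 200 KB bundle of scripts, a plausible size for a "Web 2.0" page:
print(round(download_seconds(200, 56)))    # 56 kbps dial-up: ~29 s
print(round(download_seconds(200, 1000)))  # 1 Mbps broadband: ~2 s
```

On those assumed numbers, the same page costs a dial-up user the better part of half a minute per visit, which is the overhead being argued about.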
> And what is wrong with a simple forms-based comment section, much like we
> saw in the pre-Web 2.0 days? Nothing. People get annoyed when they have
> configured a JavaScript white list in their browser and then visit a site
> where they must temporarily enable JavaScript.
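For what it's worth, the kind of forms-based commenting being described needs nothing from the browser beyond a plain HTML `<form>` POST. The sketch below is my own hypothetical illustration (the field names and helper are invented, not from any real site): the server parses a standard form-encoded body with the Python standard library and escapes the result before echoing it back, with no JavaScript involved at any point.

```python
# Minimal sketch of server-side handling for a no-JavaScript comment form
# (hypothetical field names "name" and "comment"; any CGI script or web
# framework would receive a body in exactly this format from
# <form method="post"> with ordinary <input> elements).
from urllib.parse import parse_qs
from html import escape

def handle_comment_post(body: str) -> str:
    """Parse an application/x-www-form-urlencoded body and return the
    comment, HTML-escaped so it is safe to render back into the page."""
    fields = parse_qs(body)
    name = fields.get("name", ["anonymous"])[0]
    comment = fields.get("comment", [""])[0]
    return f"{escape(name)}: {escape(comment)}"

# What the browser sends when the form is submitted ('+' encodes a space):
print(handle_comment_post("name=Darrell&comment=No+JS+needed"))
```

The escaping step matters: because the comment is echoed into HTML, a submission like `comment=%3Cscript%3E` comes back as harmless `&lt;script&gt;` text rather than markup.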
This will also be considered as an option.
>> Scripts are just small text files; the overhead is as minimal as the
>> already existing CSS files. (Or do you have that disabled too?)
>> JavaScript is 15+ years old. Stop being absurd! Are you running machines
>> that are pre-Pentium 1? Of course we won't be rendering WebGL 3D frames;
>> we'll be using very simplistic JavaScript to achieve a minimal end. In
>> fact, most websites these days depend on it.
> I'm being realistic. We have two different perspectives. I was using
> computers back before there were BBSs. Back when there was no World Wide
> Web. All we had were modems. I remember when sites had to be designed to
> be efficient, which seldom happens anymore. Until a few years ago I had
> to survive on dial-up. I have not forgotten those days, and I will not
> forget people still using dial-up or low-broadband connections.
You sound like an old fogie who is concerned that rock and roll is from the
devil. Again, our website is very minimalist and will remain so, JavaScript
or not.
> I see TDE as being a good desktop for people using older hardware. People
> using older hardware are unlikely to have fast connectivity.
I don't think there is a direct connection between the two. Then again,
people using Linux are more likely to know about computers and want a
faster connection.
> That you allege many web sites today depend upon JavaScript is about the
> same as saying everybody else is jumping off the cliff, so let's do that
> too. I never have been much of a "jump on the bandwagon" person. :)
Sink or swim, I'm diving in. Again, comments are a thing of Web 2.0 and
interactive webpages are as well. If we want these features then we need to
implement them in a modern way.
>> I concur with this as well. I think we need to work on our top display
>> bar; other than that, the website works well. It renders well in a
>> 640x640 window.
> My recommendations included a simpler nav bar.
Yep, duly noted.
Calvin Morrison