Coding: Fleeting Thoughts

A place to discuss the implementation and style of computer programs.

Moderators: phlip, Moderators General, Prelates

0xBADFEED
Posts: 687
Joined: Mon May 05, 2008 2:14 am UTC

Re: Coding: Fleeting Thoughts

Postby 0xBADFEED » Sat Jul 31, 2010 6:52 pm UTC

TNorthover wrote:My main point with unicode was that it, if anything, canonically says what is in a string and it has a code point for 0. What happens with various encodings wasn't even on my mind.
(Incidentally, I think you can put any unicode character except 0 into a utf8-encoded, null-terminated string. Most languages' strings don't have that limitation, naturally).

Ah, OK. I misinterpreted your point. You're saying that any implementation that can't allow U+0000 embedded in the string is not a faithful implementation of the standard.

User avatar
hotaru
Posts: 1045
Joined: Fri Apr 13, 2007 6:54 pm UTC

Re: Coding: Fleeting Thoughts

Postby hotaru » Sat Jul 31, 2010 6:54 pm UTC

0xBADFEED wrote:Although, if someone's putting UTF-16 or UTF-32 into byte strings and trying to manipulate them as NTBS's then hope for them was lost long ago.

FTFY.

TNorthover wrote:My main point with unicode was that it, if anything, canonically says what is in a string and it has a code point for 0.

this is what unicode says about what is in a string:
http://www.unicode.org/versions/Unicode5.2.0/ch03.pdf wrote:In the context of programming languages, the value of a string data type basically consists of a code unit sequence. Informally, a code unit sequence is itself just referred to as a string, and a byte sequence is referred to as a byte string. Care must be taken in making this terminological equivalence, however, because the formally defined concept of a string may have additional requirements or complications in programming languages. For example, a string is defined as a pointer to char in the C language and is conventionally terminated with a NULL character. In object-oriented languages, a string is a complex object, with associated methods, and its value may or may not consist of merely a code unit sequence.

Code: Select all

factorial = product . enumFromTo 1
isPrime n =
  factorial (n - 1) `mod` n == n - 1

User avatar
Xanthir
My HERO!!!
Posts: 5426
Joined: Tue Feb 20, 2007 12:49 am UTC
Location: The Googleplex
Contact:

Re: Coding: Fleeting Thoughts

Postby Xanthir » Sat Jul 31, 2010 7:38 pm UTC

TNorthover wrote:(Incidentally, I think you can put any unicode character except 0 into a utf8-encoded, null-terminated string. Most languages' strings don't have that limitation, naturally).

Correct. utf8-encoded strings never have a 0 byte unless you're specifically encoding U+0000.

(That, btw, is precisely why HTML5 requires parsers to normalize U+0000 to U+FFFD - it protects downstream tools that aren't designed to cope with embedded NULs.)
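The no-zero-bytes property is easy to sanity-check, by the way; here's a quick brute-force check, sketched in Python (any language with a UTF-8 encoder would do):

Code: Select all

# Exhaustive check: no code point other than U+0000 produces a 0x00 byte
# in UTF-8 (surrogates are skipped because they can't be encoded at all).
assert all(0 not in chr(cp).encode("utf-8")
           for cp in range(1, 0x110000)
           if not 0xD800 <= cp <= 0xDFFF)
assert "\u0000".encode("utf-8") == b"\x00"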
(defun fibs (n &optional (a 1) (b 1)) (take n (unfold '+ a b)))

User avatar
PM 2Ring
Posts: 3715
Joined: Mon Jan 26, 2009 3:19 pm UTC
Location: Sydney, Australia

Re: Coding: Fleeting Thoughts

Postby PM 2Ring » Sun Aug 01, 2010 1:20 am UTC

0xBADFEED wrote:
PM 2Ring wrote:
You, sir, name? wrote:Well, I don't know how C# does things, but I'd have thought signaling end-of-string with \0 was the convention.

It's a C convention. Prepending the string with its length is an earlier convention. It was fairly standard in BASIC, and Pascal.

Sure, you'd expect C# to follow C conventions, but you ought to know by now what Microsoft's attitude is to conventions. :)

Strings in BCPL are rather scary: the length of the string is at a negative offset to the head of the string, i.e. s[0] is the start of the character data, and s[-1] contains the string length. (Note that those indices are word-based, not byte based.)

Honestly, Pascal-style is the better convention. I don't think it had anything to do with MS's attitude and it was purely a technical choice where they went with the better option. It makes a lot more sense when you think about wanting to read in the binary file you just wrote.

OK, so I have to read a string now. How much buffer should I allocate? With length-prefixed strings you know. With NTBS's you have to read/rewind/read or grow the buffer incrementally, both of which are less performant and more complicated.


I've used BASIC/Pascal-style strings a lot, and tend to agree that they are superior to C-style strings, which don't have many advantages at all, apart from their simplicity & the saving of a byte or 3. FWIW, I've converted C-style strings to Pascal-style strings in C programs from time to time (or used a simple struct to accomplish the same thing), although I'm more likely to pack them into one big array & build an index array. Having to compute the length of a given string more than once has always seemed very inefficient to me.
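To illustrate, here's a minimal sketch of the length-prefix idea in Python (the 4-byte little-endian prefix and the helper names are just arbitrary choices, not any particular convention):

Code: Select all

import io
import struct

def write_lp_string(buf, s):
    # Write a 4-byte little-endian length, then the UTF-8 bytes.
    data = s.encode("utf-8")
    buf.write(struct.pack("<I", len(data)))
    buf.write(data)

def read_lp_string(buf):
    # One fixed-size read for the length, then one exact-sized read for the
    # data: no read/rewind/read and no incremental buffer growing.
    (length,) = struct.unpack("<I", buf.read(4))
    return buf.read(length).decode("utf-8")

buf = io.BytesIO()
write_lp_string(buf, "hello, world")
buf.seek(0)
assert read_lp_string(buf) == "hello, world"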

My dig at MS was mostly in regards to their doing non-C-like things in a language named after C.

I just realized I made a mistake in my remark about BCPL strings: the size is at offset 0. However, I'm sure I have used strings in some environment where the size was at a negative offset to the string handle, but I can't recall where it was ATM. Actually, that system's not too bad, if you pay attention to what you're doing.

User avatar
Berengal
Superabacus Mystic of the First Rank
Posts: 2707
Joined: Thu May 24, 2007 5:51 am UTC
Location: Bergen, Norway
Contact:

Re: Coding: Fleeting Thoughts

Postby Berengal » Sun Aug 01, 2010 10:46 am UTC

So I was compressing some 500 MB PostScript files down to about 1.5 MB using WinRAR, which was all well and good, except those large files were just intermediate files that I didn't want to store on a hard drive. When I switched to compressing from standard input, however, the compressed files expanded to about 100 MB, which is much too large. The internet won't give me a good explanation for this.

Maybe I could use gzip...
It is practically impossible to teach good programming to students who are motivated by money: As potential programmers they are mentally mutilated beyond hope of regeneration.

User avatar
PM 2Ring
Posts: 3715
Joined: Mon Jan 26, 2009 3:19 pm UTC
Location: Sydney, Australia

Re: Coding: Fleeting Thoughts

Postby PM 2Ring » Sun Aug 01, 2010 10:54 am UTC

Berengal wrote:So I was compressing some 500 MB PostScript files down to about 1.5 MB using WinRAR, which was all well and good, except those large files were just intermediate files that I didn't want to store on a hard drive. When I switched to compressing from standard input, however, the compressed files expanded to about 100 MB, which is much too large. The internet won't give me a good explanation for this.

Maybe I could use gzip...

That's a pretty huge compression ratio. I guess it could be possible, but did you test those 1.5 MB files to make sure they actually worked?

I'd expect file compression to be (potentially) more compact than stream compression, since the whole file can be analyzed to generate an optimal encoding table. But I know nothing about RAR, and haven't played around with the details of compression algorithms for more than a decade.

FWIW, PostScript has a few compression filters built in, but they don't seem to get that much use, IME (not that I make a habit of reading machine-generated PostScript files :) ).

User avatar
Berengal
Superabacus Mystic of the First Rank
Posts: 2707
Joined: Thu May 24, 2007 5:51 am UTC
Location: Bergen, Norway
Contact:

Re: Coding: Fleeting Thoughts

Postby Berengal » Sun Aug 01, 2010 11:36 am UTC

Yes, they work. PostScript itself is highly redundant being a programming language and all, and in this case the large files are just a concatenation of thousands of smaller files, each with the same layout which is also very repetitive. The actual data content of each file is very small. I wouldn't have been too surprised if we achieved a 1:2000 compression ratio.
It is practically impossible to teach good programming to students who are motivated by money: As potential programmers they are mentally mutilated beyond hope of regeneration.

User avatar
PM 2Ring
Posts: 3715
Joined: Mon Jan 26, 2009 3:19 pm UTC
Location: Sydney, Australia

Re: Coding: Fleeting Thoughts

Postby PM 2Ring » Sun Aug 01, 2010 12:42 pm UTC

Berengal wrote:Yes, they work.

Cool. IME, it's always a good idea to check these things when odd things happen. :)

Berengal wrote:PostScript itself is highly redundant being a programming language and all, and in this case the large files are just a concatenation of thousands of smaller files, each with the same layout which is also very repetitive. The actual data content of each file is very small. I wouldn't have been too surprised if we achieved a 1:2000 compression ratio.

And if all the concatenated images have the same large identical prolog, there's plenty of scope for compression. If that's the case, it certainly explains the size difference between compressing a disk file v compressing stdin.

EDIT
And if that is the case, it may be worthwhile to whip up a little utility that will scan the files & remove the redundant copies of the prologs. Or to remove them all, and replace them with a customized prolog. With a customized prolog you could even save more space by adding a procedure or two that create the repetitive layout more efficiently than the machine-generated prologs.
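A rough sketch of what such a filter could look like, assuming the generated files delimit their prologs with the standard DSC %%BeginProlog / %%EndProlog comments (machine-generated PS doesn't always bother):

Code: Select all

import sys

def drop_duplicate_prologs(lines):
    # Pass each %%BeginProlog ... %%EndProlog block through only the first
    # time that exact block is seen; later byte-identical copies are dropped.
    seen = set()
    prolog, in_prolog = [], False
    for line in lines:
        if line.startswith("%%BeginProlog"):
            in_prolog, prolog = True, [line]
        elif in_prolog:
            prolog.append(line)
            if line.startswith("%%EndProlog"):
                in_prolog = False
                block = "".join(prolog)
                if block not in seen:
                    seen.add(block)
                    yield from prolog
        else:
            yield line

if __name__ == "__main__":
    sys.stdout.writelines(drop_duplicate_prologs(sys.stdin))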


I enjoy writing in PostScript, mostly because I like RPN, but also because I think the original core of PS was very well designed. I'm not so keen on some of the later additions & extensions, though.

I sometimes do silly things in PS, like programs that calculate & print e to large numbers of decimal places, or self-formatting tables of primes. Or replicating old-fashioned logarithm tables. Mandelbrot generators in PS are not so practical, but it's excellent for algorithm-heavy vector graphics, and things like logarithmic spirals of text.
Last edited by PM 2Ring on Sun Aug 01, 2010 1:19 pm UTC, edited 1 time in total.

User avatar
TNorthover
Posts: 191
Joined: Wed May 06, 2009 7:11 am UTC
Location: Cambridge, UK

Re: Coding: Fleeting Thoughts

Postby TNorthover » Sun Aug 01, 2010 12:53 pm UTC

Berengal wrote:Yes, they work. PostScript itself is highly redundant being a programming language and all, and in this case the large files are just a concatenation of thousands of smaller files, each with the same layout which is also very repetitive. The actual data content of each file is very small. I wouldn't have been too surprised if we achieved a 1:2000 compression ratio.

Perhaps inter-file compression is being disabled in some applicable sense? You could try compressing a tar of the lot via stdin. That'd tell you something useful.

User avatar
Berengal
Superabacus Mystic of the First Rank
Posts: 2707
Joined: Thu May 24, 2007 5:51 am UTC
Location: Bergen, Norway
Contact:

Re: Coding: Fleeting Thoughts

Postby Berengal » Sun Aug 01, 2010 1:45 pm UTC

PM 2Ring wrote:And if all the concatenated images have the same large identical prolog, there's plenty of scope for compression. If that's the case, it certainly explains the size difference between compressing a disk file v compressing stdin.
That's what I've come to suspect as well. Perhaps if I can somehow force the buffer-size to span some of the smaller files it'll better notice the large redundancies, not just the small ones.
PM 2Ring wrote:Or to remove them all, and replace them with a customized prolog.
While incredibly tempting, these files have to be generated from pdfs. I'm not about to start mucking around in machine-generated ps. It's also a bit more work than I'm willing to put in to fix something that already works. I really just don't want those huge temporary files to hit disk unnecessarily.
TNorthover wrote:Perhaps inter-file compression is being disabled in some applicable sense? You could try compressing a tar of the lot via stdin. That'd tell you something useful.
They're already concatenated into one large file.
It is practically impossible to teach good programming to students who are motivated by money: As potential programmers they are mentally mutilated beyond hope of regeneration.

User avatar
PM 2Ring
Posts: 3715
Joined: Mon Jan 26, 2009 3:19 pm UTC
Location: Sydney, Australia

Re: Coding: Fleeting Thoughts

Postby PM 2Ring » Sun Aug 01, 2010 2:09 pm UTC

Berengal wrote:
PM 2Ring wrote:And if all the concatenated images have the same large identical prolog, there's plenty of scope for compression. If that's the case, it certainly explains the size difference between compressing a disk file v compressing stdin.
That's what I've come to suspect as well. Perhaps if I can somehow force the buffer-size to span some of the smaller files it'll better notice the large redundancies, not just the small ones.

It's certainly worth a try. I just had a quick look at the Wikipedia pages on RAR & PPM. They don't give much info, but I guess that if you make the buffer size greater than two of the smaller files, the pattern prediction ought to be able to notice the prolog duplication. But without knowing the details of how the PPM model works, it's hard to say how well it'll be able to filter out the large-scale redundancy.

Berengal wrote:
PM 2Ring wrote:Or to remove them all, and replace them with a customized prolog.
While incredibly tempting, these files have to be generated from pdfs. I'm not about to start mucking around in machine-generated ps. It's also a bit more work than I'm willing to put in to fix something that already works. I really just don't want those huge temporary files to hit disk unnecessarily

Fair enough. But if the prologs are identical, it won't be too hard to remove all but the first one, and the resulting file should still print identically.

What are you using to do the PDF to PS conversion? I've used both Ghostscript & ImageMagick to do that (and fiddled with the output using awk). They each have their strengths & weaknesses, but it's been a while since I've delved into them, so I can't remember which is better for what. :) IIRC, gs does exactly what you tell it to do, whereas IM's convert does extra things (like dithering images) it thinks are helpful.

User avatar
Berengal
Superabacus Mystic of the First Rank
Posts: 2707
Joined: Thu May 24, 2007 5:51 am UTC
Location: Bergen, Norway
Contact:

Re: Coding: Fleeting Thoughts

Postby Berengal » Sun Aug 01, 2010 3:06 pm UTC

PM 2Ring wrote:What are you using to do the PDF to PS conversion?
Ghostscript, and a regex to touch up some tray info (which incidentally also breaks PS viewers' ability to render the file for some reason).

What I'd really like is to get rid of rar altogether and use zip since then I could outsource the heavy lifting to some of our sparc-boxen, using either the Java built-in ZipOutputStream (or GZIPOutputStream, but they seem to be the same, modulo zip's archiving abilities which I don't need) or native unix tools (but that would require those to be installed on the windows machines running this as well). For this to happen though I need to figure out a way to make it compress better than the 1:5 ratio I get now. So far my experiments with buffer sizes haven't shown any difference in compression, but I don't have access to the actual files I'll be compressing now.
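The streaming plumbing itself is trivial, at least; here's a minimal sketch using Python's gzip module, purely as a stand-in for GZIPOutputStream or the command-line gzip:

Code: Select all

import gzip
import shutil
import sys

# Compress whatever arrives on stdin straight to stdout, so the huge
# intermediate PostScript never has to hit the disk.
with gzip.GzipFile(fileobj=sys.stdout.buffer, mode="wb", compresslevel=9) as gz:
    shutil.copyfileobj(sys.stdin.buffer, gz)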
It is practically impossible to teach good programming to students who are motivated by money: As potential programmers they are mentally mutilated beyond hope of regeneration.

0xBADFEED
Posts: 687
Joined: Mon May 05, 2008 2:14 am UTC

Re: Coding: Fleeting Thoughts

Postby 0xBADFEED » Sun Aug 01, 2010 4:38 pm UTC

Berengal wrote:So I was compressing some 500 MB PostScript files down to about 1.5 MB using WinRAR, which was all well and good, except those large files were just intermediate files that I didn't want to store on a hard drive. ...

For this to happen though I need to figure out a way to make it compress better than the 1:5 ratio I get now. So far my experiments with buffer sizes haven't shown any difference in compression

This may be a stupid suggestion, but have you tried just keeping the entire thing in memory and then running the compression? I mean 500 MB is a lot but it's not that huge considering machines with 6+ GB RAM are fairly normal now. Although I don't know what sort of environment you're trying to run this process in.

Also it wasn't clear (to me at least) if you're using different utilities for stream-based vs file-based compression.

User avatar
PM 2Ring
Posts: 3715
Joined: Mon Jan 26, 2009 3:19 pm UTC
Location: Sydney, Australia

Re: Coding: Fleeting Thoughts

Postby PM 2Ring » Sun Aug 01, 2010 5:01 pm UTC

Berengal wrote:
PM 2Ring wrote:What are you using to do the PDF to PS conversion?
Ghostscript, and a regex to touch up some tray info (which incidentally also breaks PS viewers' ability to render the file for some reason).

That sounds like the regex is changing the DSC in a way that the viewers can't cope with, which isn't good. But DSC is one of those parts of PS that I'm not impressed with, and have been bitten by when I've tried to modify them using regex-based techniques.

And I still think you should take a peek to see if there are multiple prologs per file, and if those prologs are identical. :)

Berengal wrote:What I'd really like is to get rid of rar altogether and use zip since then I could outsource the heavy lifting to some of our sparc-boxen, using either the Java built-in ZipOutputStream (or GZIPOutputStream, but they seem to be the same, modulo zip's archiving abilities which I don't need) or native unix tools (but that would require those to be installed on the windows machines running this as well). For this to happen though I need to figure out a way to make it compress better than the 1:5 ratio I get now. So far my experiments with buffer sizes haven't shown any difference in compression, but I don't have access to the actual files I'll be compressing now.

I agree that GZIP is a good idea. OTOH, BADFEED's suggestion has merit: can you mount a large ramdisk to hold the files? I don't use Windows much, so I don't know if you can get a freeware ramdisk that's big enough. And who'd want to pay cash for something like that just to see if it'll work? :)

User avatar
Berengal
Superabacus Mystic of the First Rank
Posts: 2707
Joined: Thu May 24, 2007 5:51 am UTC
Location: Bergen, Norway
Contact:

Re: Coding: Fleeting Thoughts

Postby Berengal » Sun Aug 01, 2010 5:32 pm UTC

Keeping the entire thing in memory is exactly what I'm trying to do. If I can get gzip to compress this well then there's no problem since that's a stream compressor and doesn't need a ramdisk anyway. Another advantage of gzip (or a pure Java implementation with no dependencies on external tools) is that I can then just throw it on our big irons, which are all sparc-boxen running solaris, and laugh at any performance problems we've had. If not, then the app needs to be distributable because we don't have any good windows machines (the current crappy version of the program does just this. It also has some windows dependencies other than winrar that are harder to fix, but they're not resource intensive and can all be easily collected in a server/distributor process). Distributable in this context means regular workstations running with limited permissions. Putting a ramdisk on those is not really tempting.

0xBADFEED wrote:Also it wasn't clear (to me at least) if you're using different utilities for stream-based vs file-based compression.
I'm using winrar for both the file-based and stream-based rar compression. Gzip is a stream compressor already.
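Worth noting for those buffer experiments: gzip and zip both use DEFLATE, which only looks back over a 32 KB window, so repeats that sit further apart than that are invisible to it no matter how the input is chunked. A tiny illustration with synthetic data (not the real files):

Code: Select all

import os
import zlib

small = os.urandom(10000)    # copies repeat 10 KB apart, inside the 32 KB window
large = os.urandom(100000)   # copies repeat 100 KB apart, outside the window

print(len(zlib.compress(small * 10, 9)))   # roughly the size of one copy
print(len(zlib.compress(large * 10, 9)))   # roughly the size of all ten copies

Which would explain why fiddling with buffer sizes doesn't change the ratio.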
It is practically impossible to teach good programming to students who are motivated by money: As potential programmers they are mentally mutilated beyond hope of regeneration.

troyp
Posts: 557
Joined: Thu May 22, 2008 9:20 pm UTC
Location: Lismore, NSW

Re: Coding: Fleeting Thoughts

Postby troyp » Sun Aug 01, 2010 10:50 pm UTC

This is probably a stupid suggestion, but have you tried archiving the small files with tar, rather than concatenating them (you can always concatenate them at the other end)?
I'm just thinking that tar is designed to archive a bunch of often similar files and to work with compression, so maybe it'll work better for whatever reason.
I don't know much about compression and archiving formats, though, so I'm just guessing.

User avatar
e^iπ+1=0
Much, much better than Gooder
Posts: 2065
Joined: Sun Feb 15, 2009 9:41 am UTC
Location: Lancaster

Re: Coding: Fleeting Thoughts

Postby e^iπ+1=0 » Wed Aug 04, 2010 4:51 pm UTC

while (1)
{
}
you.GiveUp();
you.LetDown();
around.Run() && you.Desert();
you.MakeCry();
goodbye.Say();
lie.Tell() && you.Hurt();
poxic wrote:You, sir, have heroic hair.
poxic wrote:I note that the hair is not slowing down. It appears to have progressed from heroic to rocking.

(Avatar by Sungura)

User avatar
Dason
Posts: 1311
Joined: Wed Dec 02, 2009 7:06 am UTC
Location: ~/

Re: Coding: Fleeting Thoughts

Postby Dason » Wed Aug 04, 2010 4:52 pm UTC

I'm not sure if you should be proud or ashamed of that.
double epsilon = -.0000001;

User avatar
Xeio
Friends, Faidites, Countrymen
Posts: 5101
Joined: Wed Jul 25, 2007 11:12 am UTC
Location: C:\Users\Xeio\
Contact:

Re: Coding: Fleeting Thoughts

Postby Xeio » Wed Aug 04, 2010 5:38 pm UTC

e^iπ+1=0 wrote:while (1)
{
}
me.GiveUp(you);
me.LetDown(you);
me.Run(around) && me.Desert(you);
me.MakeCry(you);
me.Say(goodbye);
me.Tell(lies) && me.Hurt(you);
[/better OO approach][/stupid corrections]

User avatar
Xanthir
My HERO!!!
Posts: 5426
Joined: Tue Feb 20, 2007 12:49 am UTC
Location: The Googleplex
Contact:

Re: Coding: Fleeting Thoughts

Postby Xanthir » Wed Aug 04, 2010 6:36 pm UTC

Thanks, Xeio. It's not just a better OO approach, it's actually a correct handling of the requirements document. The previous code did the opposite of the requirements in many cases.
(defun fibs (n &optional (a 1) (b 1)) (take n (unfold '+ a b)))

User avatar
e^iπ+1=0
Much, much better than Gooder
Posts: 2065
Joined: Sun Feb 15, 2009 9:41 am UTC
Location: Lancaster

Re: Coding: Fleeting Thoughts

Postby e^iπ+1=0 » Wed Aug 04, 2010 6:39 pm UTC

Found it on another site, don't blame me. I don't even really code, I just know some python that I use almost exclusively for Project Euler problems.
poxic wrote:You, sir, have heroic hair.
poxic wrote:I note that the hair is not slowing down. It appears to have progressed from heroic to rocking.

(Avatar by Sungura)

User avatar
Berengal
Superabacus Mystic of the First Rank
Posts: 2707
Joined: Thu May 24, 2007 5:51 am UTC
Location: Bergen, Norway
Contact:

Re: Coding: Fleeting Thoughts

Postby Berengal » Wed Aug 04, 2010 6:41 pm UTC

Code: Select all

main =
  let rick = rick
      roll = flip sequence you $ map ($me)
                                   [ giveUp
                                   , letDown
                                   , (&&) <$> const . (run around) <*> desert
                                   , makeCry
                                   , const . (say goodbye)
                                   , (&&) <$> const . tellLies <*> hurt
                                   ]
  in const rick roll

[/made functional]
It is practically impossible to teach good programming to students who are motivated by money: As potential programmers they are mentally mutilated beyond hope of regeneration.

User avatar
TheChewanater
Posts: 1279
Joined: Sat Aug 08, 2009 5:24 am UTC
Location: lol why am I still wearing a Santa suit?

Re: Coding: Fleeting Thoughts

Postby TheChewanater » Thu Aug 05, 2010 1:41 am UTC

Code: Select all

for (so_long)
{
  me.know (you);
}
 


Also,

Code: Select all

action:
  if [ $(you) == "cry" ]; then exit; fi

Code: Select all

$ make you=cry
$


More yet:

Code: Select all

assert (we != dynamic_cast<Love> (strangers));
http://internetometer.com/give/4279
No one can agree how to count how many types of people there are. You could ask two people and get 10 different answers.

User avatar
RoadieRich
The Black Hand
Posts: 1037
Joined: Tue Feb 12, 2008 11:40 am UTC
Location: Behind you

Re: Coding: Fleeting Thoughts

Postby RoadieRich » Thu Aug 05, 2010 1:59 am UTC

Code: Select all

if this == reallife
elif this == fantasy
catch Landslide
unescape(reality)
e = open("Your eyes")
e.lookup("skies") && e.see()
assert isinstance(me, poorboy)
assert needs_sympathy(me) == False
def needs_sympathy(person):
    return not (hasattr(person, easy_come) and hasattr(person, easy_go))
with little:
    high()
    low()
wind.blow(anywhere)
assert matters(to_me) == False
to(me)

Just don't try singing this with the -o flag enabled.
73, de KE8BSL loc EN26.

elminster
Posts: 1560
Joined: Mon Feb 26, 2007 1:56 pm UTC
Location: London, UK, Dimensions 1 to 42.
Contact:

Re: Coding: Fleeting Thoughts

Postby elminster » Thu Aug 05, 2010 3:01 am UTC

@RoadieRich: Oh wow. I laughed pretty hard at that.

User avatar
TheChewanater
Posts: 1279
Joined: Sat Aug 08, 2009 5:24 am UTC
Location: lol why am I still wearing a Santa suit?

Re: Coding: Fleeting Thoughts

Postby TheChewanater » Thu Aug 05, 2010 3:23 am UTC

You know what would be cool? A natural-language generator that parses source code to turn it into (probably unintelligible) sentences.
http://internetometer.com/give/4279
No one can agree how to count how many types of people there are. You could ask two people and get 10 different answers.

User avatar
PM 2Ring
Posts: 3715
Joined: Mon Jan 26, 2009 3:19 pm UTC
Location: Sydney, Australia

Re: Coding: Fleeting Thoughts

Postby PM 2Ring » Thu Aug 05, 2010 8:57 am UTC

My Google-fu is failing me. :(

Is there a simple way to fetch JSON data and display it using JavaScript? Ideally, I'd like to parse it properly & pretty-print it, but at this stage I'd be happy just to display it in an alert. If someone could post some example code using http://xkcd.com/info.0.json, I'd be most grateful.

TIA

Maelstrom.
Posts: 76
Joined: Tue Oct 21, 2008 12:18 pm UTC

Re: Coding: Fleeting Thoughts

Postby Maelstrom. » Thu Aug 05, 2010 9:44 am UTC

PM 2Ring wrote:My Google-fu is failing me. :(

Is there a simple way to fetch JSON data and display it using JavaScript? Ideally, I'd like to parse it properly & pretty-print it, but at this stage I'd be happy just to display it in an alert. If someone could post some example code using http://xkcd.com/info.0.json, I'd be most grateful.

TIA


Unfortunately, fetching stuff like a JavaScript document from an external site is rather tricky, for security reasons. Tricks such as JSONP are used to get around this limitation, but the web server/script has to be set up to handle JSONP requests. I don't know if the XKCD scripts are set up to handle JSONP, unfortunately.

If you just want to fetch something from another server for the purpose of testing, try something like Twitter or Flickr. They both have public APIs with JSONP functionality.

If you want the XKCD data specifically, try running it through this Yahoo Pipe. Yahoo Pipes take in data from various sources (JSON, XML, RSS, HTML, etc.), gather, format, sort, and filter them, and then output them in the requested format. They are kinda cool. There is a pipe that allows you to grab any JSON feed and turn it into a JSONP request. For example:
http://pipes.yahoo.com/pipes/pipe.run?_id=29053b7ff74d5086a97cb14ad3ba0aba&_render=json&url=http%3A%2F%2Fxkcd.com%2Finfo.0.json&_callback=jsonp_callback

Note the parameters. _id is the pipe id. _render is the format (json, xml, rss, etc). _callback is the parameter you would be most interested in. By supplying jsonp_callback here, the output of the pipe is sent to the jsonp_callback function when it all loads.

As to actually making a JSONP request, that depends upon your JavaScript environment. If you're using a framework like jQuery or MooTools, then they have their own implementations of JSONP, which are very handy. Otherwise, you're going to have to make it all yourself.

Good luck :)

User avatar
PM 2Ring
Posts: 3715
Joined: Mon Jan 26, 2009 3:19 pm UTC
Location: Sydney, Australia

Re: Coding: Fleeting Thoughts

Postby PM 2Ring » Thu Aug 05, 2010 11:36 am UTC

Maelstrom. wrote:
PM 2Ring wrote:My Google-fu is failing me. :(

Is there a simple way to fetch JSON data and display it using JavaScript? Ideally, I'd like to parse it properly & pretty-print it, but at this stage I'd be happy just to display it in an alert. If someone could post some example code using http://xkcd.com/info.0.json, I'd be most grateful.

TIA


Unfortunately, fetching stuff like a JavaScript document from an external site is rather tricky, for security reasons. Tricks such as JSONP are used to get around this limitation, but the web server/script has to be set up to handle JSONP requests. I don't know if the XKCD scripts are set up to handle JSONP, unfortunately.

If you just want to fetch something from another server for the purpose of testing, try something like Twitter or Flickr. They both have public APIs with JSONP functionality.

If you want the XKCD data specifically, try running it through this Yahoo Pipe. Yahoo Pipes take in data from various sources (JSON, XML, RSS, HTML, etc.), gather, format, sort, and filter them, and then output them in the requested format. They are kinda cool. There is a pipe that allows you to grab any JSON feed and turn it into a JSONP request. For example:
http://pipes.yahoo.com/pipes/pipe.run?_id=29053b7ff74d5086a97cb14ad3ba0aba&_render=json&url=http%3A%2F%2Fxkcd.com%2Finfo.0.json&_callback=jsonp_callback

Note the parameters. _id is the pipe id. _render is the format (json, xml, rss, etc). _callback is the parameter you would be most interested in. By supplying jsonp_callback here, the output of the pipe is sent to the jsonp_callback function when it all loads.

As to actually making a JSONP request, that depends upon your JavaScript environment. If you're using a framework like jQuery or MooTools, then they have their own implementations of JSONP, which are very handy. Otherwise, you're going to have to make it all yourself.

Good luck :)

Thanks, Maelstrom.

I'm not quite sure how to use that Yahoo pipe. How do I "call" it from Javascript? Also, to process the returned data, I assume my Javascript has to define a function called jsonp_callback() - is that correct?

It all sounds a bit too complicated. I don't know jQuery (yet), and I can do simple parsing of JSON in the shell using wget & awk, or Python, but I was hoping to make something that can run in a basic HTML page, or even better, as a simple javascript: bookmarklet. Oh well.

Ubik
Posts: 1016
Joined: Thu Oct 18, 2007 3:43 pm UTC

Re: Coding: Fleeting Thoughts

Postby Ubik » Thu Aug 05, 2010 1:15 pm UTC

A joke:
- How do you use the database abstraction of Zend Framework or Doctrine when there is no PDO driver matching the database?
- You just don't and you take the slow and painful road of writing a thin and shitty DB abstraction layer yourself after futile attempts to find a relatively nice workaround.
Haha.

User avatar
Rippy
Posts: 2101
Joined: Sun Jul 22, 2007 11:27 pm UTC
Location: Ontario, Can o' Duh

Re: Coding: Fleeting Thoughts

Postby Rippy » Thu Aug 05, 2010 3:31 pm UTC

Any suggestions for writing a loop for continually prompting for user commands then executing them in Haskell? Best I can come up with right now is a recursive function with pattern-matching on the "quit" command to break the loop, but due to that exit strategy the function has to be seeded with the first line of input, which is kind of ugly.

User avatar
Berengal
Superabacus Mystic of the First Rank
Posts: 2707
Joined: Thu May 24, 2007 5:51 am UTC
Location: Bergen, Norway
Contact:

Re: Coding: Fleeting Thoughts

Postby Berengal » Thu Aug 05, 2010 4:01 pm UTC

Rippy wrote:Any suggestions for writing a loop for continually prompting for user commands then executing them in Haskell? Best I can come up with right now is a recursive function with pattern-matching on the "quit" command to break the loop, but due to that exit strategy the function has to be seeded with the first line of input, which is kind of ugly.

Pull the recursion out. Here's a simple way of doing it, but you can easily complicate it to fit your needs.

Code: Select all

import Control.Monad

loopPrompt exit execute = go
  where
    go = do
      line <- getLine
      when (line /= exit)
        $ execute line >> go

execute "hello" = putStrLn "Hello back to you"
execute "Oh me yarm!" = putStrLn "lol"
execute _ = putStrLn "What what in the butt"

main = loopPrompt "quit" execute
It is practically impossible to teach good programming to students who are motivated by money: As potential programmers they are mentally mutilated beyond hope of regeneration.

User avatar
Rippy
Posts: 2101
Joined: Sun Jul 22, 2007 11:27 pm UTC
Location: Ontario, Can o' Duh

Re: Coding: Fleeting Thoughts

Postby Rippy » Thu Aug 05, 2010 4:43 pm UTC

Thanks for the nice example there. I hadn't gotten to monads in my Haskell readings yet; since I have the time I think I'll read up on that before I delve back into my project and make changes. (For the record, it's a database of the locations of my physical stuff, making somewhat unnecessary use of SQL for learning purposes.)

User avatar
levicc00123
Posts: 165
Joined: Thu Jan 03, 2008 5:33 pm UTC
Location: Sterling, CO
Contact:

Re: Coding: Fleeting Thoughts

Postby levicc00123 » Thu Aug 05, 2010 11:17 pm UTC

RT: I find I can concentrate more if I set up a playlist in Rhythmbox. It not only provides music but also acts as a timer: I make a deal with myself that if I sit still and get some code written until the playlist ends, I'll take a 30-minute break.

RT: I'm enjoying learning Haskell using Learn You a Haskell for Great Good and YAHT. I never knew Haskell could be so much fun!

Maelstrom.
Posts: 76
Joined: Tue Oct 21, 2008 12:18 pm UTC

Re: Coding: Fleeting Thoughts

Postby Maelstrom. » Thu Aug 05, 2010 11:33 pm UTC

PM 2Ring wrote:Thanks, Maelstrom.

I'm not quite sure how to use that Yahoo pipe. How do I "call" it from Javascript? Also, to process the returned data, I assume my Javascript has to define a function called jsonp_callback() - is that correct?

It all sounds a bit too complicated. I don't know jQuery (yet), and I can do simple parsing of JSON in the shell using wget & awk, or Python, but I was hoping to make something that can run in a basic HTML page, or even better, as a simple javascript: bookmarklet. Oh well.


Essentially, to make a JSONP request, your script needs to create a new script element, with its src attribute pointing to the JSONP feed. An element such as:

Code: Select all

<script src="http://example.com/request.jsonp?callback=jsonp_callback"></script>


This will then go fetch the JSONP feed, and execute it as if it is a normal JavaScript code block. To create this script block, I think you can just use normal DOM methods like

Code: Select all

var jsonp_callback = function(data) {
    // do something with the data here
}
var jsonpFeedUrl = "http://pipes.yahoo.com/pipes/pipe.run?_id=29053b7ff74d5086a97cb14ad3ba0aba&_render=json&url=http%3A%2F%2Fxkcd.com%2Finfo.0.json&_callback=jsonp_callback";
var script = document.createElement('script');
script.src = jsonpFeedUrl;
document.body.appendChild(script);


I have never done it that way myself. I usually use frameworks that take care of this stuff for me. For example, in MooTools:

Code: Select all

new Request.JSONP({
  url: 'http://pipes.yahoo.com/pipes/pipe.run',
  data: {
    _id: "29053b7ff74d5086a97cb14ad3ba0aba",
    _render: "json",
    url: "http://xkcd.com/info.0.json"
  },
  callbackKey: '_callback',
  onSuccess: function(results) {
    //Do something with the results here.
  }
});


jQuery has something similar, I'm sure. I highly recommend using a framework if you can. They just make life so much simpler.

User avatar
RoadieRich
The Black Hand
Posts: 1037
Joined: Tue Feb 12, 2008 11:40 am UTC
Location: Behind you

Re: Coding: Fleeting Thoughts

Postby RoadieRich » Fri Aug 06, 2010 12:14 am UTC

Maelstrom. wrote:Essentially, to make a JSONP request, your script needs to create a new script element, with its src attribute pointing to the JSONP feed. An element such as:

Code: Select all

<script src="http://example.com/request.jsonp?callback=jsonp_callback"></script>

I'm probably missing something, but what's wrong with putting the script block in manually?
73, de KE8BSL loc EN26.

User avatar
phlip
Restorer of Worlds
Posts: 7573
Joined: Sat Sep 23, 2006 3:56 am UTC
Location: Australia
Contact:

Re: Coding: Fleeting Thoughts

Postby phlip » Fri Aug 06, 2010 1:21 am UTC

Maelstrom. wrote:Unfortunately, fetching stuff like a JavaScript document from an external site is rather tricky, for security reasons. Tricks such as JSONP are used to get around this limitation, but the web server/script has to be set up to handle JSONP requests. I don't know if the XKCD scripts are set up to handle JSONP, unfortunately.

xkcd does support JSONP - unixkcd uses it (since that's technically cross-domain). Current comic, specific comic.

RoadieRich wrote:I'm probably missing something, but what's wrong with putting the script block in manually?

Nothing... and if you just want a static page that, say, shows the current comic, then you could have a page that has the script block in there directly... something along the lines of:

Code: Select all

<script type="text/javascript">
var details = {};
function dataloaded(o)
{
  details = o;
}
function pageloaded()
{
  with(document.getElementById("image"))
  {
    src = details.img;
    alt = details.title;
    title = details.alt;
  }
}
</script>
<script type="text/javascript" src="http://dynamic.xkcd.com/api-0/jsonp/comic/?callback=dataloaded"></script>
<body onload="pageloaded()"><img id="image"></body>

However, if you want to be able to change the image dynamically, then you're going to want to add those external script tags on the fly, depending on what comic you want to display.

Code: Select all

enum ಠ_ಠ {°□°╰=1, °Д°╰, ಠ益ಠ╰};
void ┻━┻︵​╰(ಠ_ಠ ⚠) {exit((int)⚠);}
[he/him/his]

User avatar
RoadieRich
The Black Hand
Posts: 1037
Joined: Tue Feb 12, 2008 11:40 am UTC
Location: Behind you

Re: Coding: Fleeting Thoughts

Postby RoadieRich » Fri Aug 06, 2010 2:34 am UTC

ITT: I discover another annoying quirk of HTML: behaviour on changing the src attribute is not consistent between script and img tags. I guess it's a security thing.
73, de KE8BSL loc EN26.

Maelstrom.
Posts: 76
Joined: Tue Oct 21, 2008 12:18 pm UTC

Re: Coding: Fleeting Thoughts

Postby Maelstrom. » Fri Aug 06, 2010 2:44 am UTC

RoadieRich wrote:ITT: I discover another annoying quirk of HTML: behaviour on changing the src attribute is not consistent between script and img tags. I guess it's a security thing.

Hence why I say use a framework. They worry about all the nitty-gritty of cross-browser support so that you don't have to. XMLHttpRequest, JSONP, JSON parsing, animations, opacity: all of these things are done differently in different browsers (*glares at IE*), but I've never had to worry too much about it. MooTools worries about it for me.

User avatar
phlip
Restorer of Worlds
Posts: 7573
Joined: Sat Sep 23, 2006 3:56 am UTC
Location: Australia
Contact:

Re: Coding: Fleeting Thoughts

Postby phlip » Fri Aug 06, 2010 4:12 am UTC

RoadieRich wrote:ITT: I discover another annoying quirk of HTML: behaviour on changing the src attribute is not consistent between script and img tags. I guess it's a security thing.

It's not strictly a security thing, but also that scripts are simply a different beast to images. When an img tag is read, nothing especially fancy needs to happen immediately... the image itself gets added to the list of things that need to be downloaded, which doesn't need to happen immediately (but needs to happen before the page can be considered loaded). Then, as long as the image tag is there, the picture is persistently in the document, and can be manipulated. A script on the other hand, when it's read, needs to be downloaded and run immediately so that scripts get run in the right order (unless you set the async attribute to say it's not necessary). Then once it's run, the tag is effectively inert - it won't do anything again. The script can leave persistent effects (defined functions/vars and event handlers and whatnot) but the script tag itself has nothing more to do. You could remove it without changing anything.

So yeah. To trigger a new script to run, you need to add a new script tag, you can't change an existing one... because existing ones don't do anything any more.

Code: Select all

enum ಠ_ಠ {°□°╰=1, °Д°╰, ಠ益ಠ╰};
void ┻━┻︵​╰(ಠ_ಠ ⚠) {exit((int)⚠);}
[he/him/his]

