Evan Martin (evan) wrote in evan_tech,

a better graph

I finally figured out something that I've been fighting with forever: I have a bunch of data and I want to extract its general trend. My weak stats background makes me think of Kernel Smoothing [that looks like a pretty good overview], but in stats-land you're always smoothing over probability distributions, not counts like the ones I have above.

The solution is to do it effectively by hand: the R dnorm function is the density function of a normal distribution, and the filter function effectively, uh, convolves that kernel against the data. But I'm not quite sure this is correct yet: does it, for each data point, sum in the contributions of the points around it, weighted by the normal density? Or does each point contribute a normal distribution's worth of density to the points around it? I'm shamefully poor at this stuff.
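As it happens, for a symmetric kernel like the Gaussian the two readings coincide: "sum in the neighbors' contributions" is correlation, "spread each point's mass outward" is convolution, and flipping a symmetric kernel changes nothing. A small sketch in Python (as a stand-in for the R code; the data here is made up):

```python
import numpy as np

# Hypothetical data: daily post counts, standing in for the d$count column.
counts = np.array([3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5, 8, 9, 7, 9], dtype=float)

# Gaussian kernel sampled at integer offsets, like dnorm(-40:40, sd=20),
# but narrower so it fits this toy series.
offsets = np.arange(-3, 4)
sd = 2.0
kernel = np.exp(-offsets**2 / (2 * sd**2))
kernel /= kernel.sum()  # normalize so the output stays in "posts" units

# Reading 1: each output point is a weighted sum of its neighbors (correlation).
view1 = np.correlate(counts, kernel, mode="same")

# Reading 2: each input point spreads its mass onto its neighbors (convolution).
view2 = np.convolve(counts, kernel, mode="same")

# Because the Gaussian kernel is symmetric, the two views are identical.
assert np.allclose(view1, view2)
```

So either mental model is fine here; they would only differ for an asymmetric kernel.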

In any case, the above was generated with (kernel normalized so the y-axis really is posts per day):

k <- dnorm(-40:40, sd=20)
plot(d$date, filter(d$count, k / sum(k)), type='l', main='Average Posts per Day', ylab='Posts', xlab='Year', frame.plot=F, lwd=2)
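One small wrinkle worth knowing: dnorm(-40:40, sd=20) only covers about two standard deviations on each side, so the weights sum to slightly less than 1 and an unnormalized kernel biases the smoothed counts low by a few percent; dividing the kernel by its sum fixes the units. A quick check of that sum, in Python as a stand-in for the R expression:

```python
import math

# Sample the normal density at integer offsets, as dnorm(-40:40, sd=20) does.
sd = 20.0
weights = [math.exp(-x**2 / (2 * sd**2)) / (sd * math.sqrt(2 * math.pi))
           for x in range(-40, 41)]

total = sum(weights)
# The window spans only +/- 2 standard deviations, so the weights sum to
# roughly 0.95 rather than 1.0 -- unnormalized, the curve reads a few
# percent low.
assert 0.94 < total < 0.97
```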

