sctroyenne Diglot Senior Member United StatesRegistered users can see my Skype Name Joined 5389 days ago 739 posts - 1312 votes Speaks: English*, French Studies: Spanish, Irish
| Message 921 of 1317 30 January 2014 at 6:38am | IP Logged |
emk wrote:
Oh, and I just received the French dub of Neon Genesis Evangelion. After watching two episodes, two thoughts: (1) it's possible that this is actually pretty good, but (2) the animation of the various female characters is pretty ridiculous—like a bad comic book. Anyway, this is supposed to be one of the all-time best series of this sort, so I'll give it a few more episodes.
Finally one that I can comment on :) I had a roommate that was a big anime fan and she got the whole collection and for some reason I was hooked. I think it was a combination of trying to figure out all the biblical references and the fact that the violence wasn't so graphic as to turn me off completely but graphic enough to haunt me. It was also one of my first attempts at "native" material (as opposed to educational material). I watched a few episodes on YouTube and I can still remember learning the word "cible" from it.
1 person has voted this message useful
emk Diglot Moderator United States Joined 5530 days ago 2615 posts - 8806 votes Speaks: English*, FrenchB2 Studies: Spanish, Ancient Egyptian Personal Language Map
| Message 922 of 1317 30 January 2014 at 6:37pm | IP Logged |
sctroyenne wrote:
Finally one that I can comment on :) I had a roommate that was a big anime fan and she got the whole collection and for some reason I was hooked. I think it was a combination of trying to figure out all the biblical references and the fact that the violence wasn't so graphic as to turn me off completely but graphic enough to haunt me.
OK, if the story's good enough (which seems possible so far), I'll put up with the comic book super-heroine art for a while longer.
Deep Learning
This is a post for patrickwilken, and any other geeks in the audience.
I'm a bit of an AI hobbyist. Up until about 2006 or so, there were two really good tools in the machine-learning toolbox: naive Bayes and singular value decomposition (SVD). Naive Bayes was the basis for most spam filters, and SVD was used for everything from face recognition to text classification to Netflix recommendations. (There were a few other tools in the toolbox, of course, but these two offered excellent bang for the buck.)
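For the curious, the naive Bayes idea fits in a few lines of Python. This is a toy sketch, not anyone's production spam filter, and the four training messages are invented for illustration. Training just counts words per class; classification picks the class with the higher log-probability, pretending the words are independent given the class:

```python
import math
from collections import Counter

# Tiny invented training set: (label, message) pairs.
train = [
    ("spam", "win money now"),
    ("spam", "free money offer"),
    ("ham", "meeting at noon"),
    ("ham", "lunch at noon tomorrow"),
]

word_counts = {"spam": Counter(), "ham": Counter()}
doc_counts = Counter()
for label, text in train:
    doc_counts[label] += 1
    word_counts[label].update(text.split())

vocab = {w for counter in word_counts.values() for w in counter}

def classify(text):
    best_label, best_score = None, float("-inf")
    for label, counter in word_counts.items():
        # log prior + per-word log likelihoods, with add-one smoothing
        score = math.log(doc_counts[label] / len(train))
        total = sum(counter.values())
        for word in text.split():
            score += math.log((counter[word] + 1) / (total + len(vocab)))
        if score > best_score:
            best_label, best_score = label, score
    return best_label

print(classify("free money"))     # spam
print(classify("lunch at noon"))  # ham
```

The add-one smoothing is what keeps a single never-before-seen word from zeroing out an entire class.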
Oddly, one tool that wasn't in the AI toolbox was artificial neural networks, because they were too hard to train for anything but simple problems. Some background:
- An artificial neuron is basically a non-linear sum: it takes a certain number of inputs, applies a weight to each, sums them, and fires if the result is above a threshold.
- A neural network is essentially a series of layers of artificial neurons. The first layer is hooked up to the input, and each following layer is hooked up to the previous layer.
- Until about 2006, we could only train neural networks about 3 layers deep, using a technique called "backpropagation". Basically this involved looking at the final outputs, seeing whether they were correct, and trying to push corrections back through the layers of the network. But those corrections didn't propagate well after a few layers.
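To make those bullet points concrete, here's a toy sketch in Python (the names and numbers are mine, purely illustrative): a neuron as a thresholded weighted sum, and a layer as a row of neurons reading the same inputs:

```python
# A toy artificial neuron: weighted sum of inputs, fires (outputs 1)
# if the sum clears a threshold.
def neuron(inputs, weights, threshold):
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total > threshold else 0

# A two-input neuron with weights (1, 1) and threshold 1.5 computes AND:
print(neuron([1, 1], [1, 1], 1.5))  # 1
print(neuron([1, 0], [1, 1], 1.5))  # 0

# A layer is several neurons reading the same inputs; a network feeds
# one layer's outputs into the next layer.
def layer(inputs, weight_rows, thresholds):
    return [neuron(inputs, w, t) for w, t in zip(weight_rows, thresholds)]

# Two layers suffice for XOR: the hidden layer computes [OR, AND], and
# the output neuron fires when OR is on but AND is off.
hidden = layer([1, 0], [[1, 1], [1, 1]], [0.5, 1.5])
print(neuron(hidden, [1, -1], 0.5))  # 1, i.e. XOR(1, 0)
```

XOR is the classic example of why layers matter: no single neuron of this kind can compute it, but a two-layer network can.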
So basically, for a couple of decades, we've been restricted to very shallow neural nets. It just wasn't enough to present a neural net with a series of correct/incorrect examples, and to whack it with a stick until it got the right answer. Something was missing.
What changed? Well, if you want the citations and the theory, see this survey paper/book. But in a nutshell, researchers realized that neural networks need to be trained from both ends:
1. Starting from the input layer, each layer can train itself to represent the input efficiently, without trying to make any judgments about it. This involves exposing the layer to lots and lots of input, and tweaking it until it can reduce those inputs to a terse representation (analogous to the principal eigenvectors) and then reconstruct the original input reasonably accurately. Once the first layer looks good, training can move on to the second layer.
2. Once each of the layers has self-trained to represent the input accurately, then the network can be trained from the top down, by presenting it with various examples and the desired output for each example.
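Step 1 can be sketched in miniature. Below, a single layer learns, with no labels involved, to compress 2-D points lying on a line down to one number and reconstruct them. This is purely illustrative (all the numbers are invented, and the real systems of that era used restricted Boltzmann machines or denoising autoencoders with many units, not a two-weight linear toy):

```python
import random

random.seed(0)

# The data lies on the line y = 2x, so one hidden number per point is
# enough for a near-perfect reconstruction.
data = [(t, 2 * t) for t in [random.uniform(-1, 1) for _ in range(200)]]

w = [0.1, 0.1]  # encoder weights: 2 inputs -> 1 hidden value
v = [0.1, 0.1]  # decoder weights: 1 hidden value -> 2 reconstructed inputs
lr = 0.01       # learning rate

def reconstruction_error():
    total = 0.0
    for x1, x2 in data:
        h = w[0] * x1 + w[1] * x2                             # encode
        total += (v[0] * h - x1) ** 2 + (v[1] * h - x2) ** 2  # decode, compare
    return total / len(data)

before = reconstruction_error()
for _ in range(100):  # plain per-sample gradient descent on squared error
    for x1, x2 in data:
        h = w[0] * x1 + w[1] * x2
        e1, e2 = v[0] * h - x1, v[1] * h - x2
        back = e1 * v[0] + e2 * v[1]  # error pushed back through the decoder
        v[0] -= lr * 2 * e1 * h
        v[1] -= lr * 2 * e2 * h
        w[0] -= lr * 2 * back * x1
        w[1] -= lr * 2 * back * x2
after = reconstruction_error()

print(round(before, 3), "->", round(after, 6))  # error collapses toward zero
```

Once each layer passes this kind of test, step 2 stacks a classifier on top and fine-tunes the whole network with labeled examples.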
Anyway, what these results indicate is that neural nets can learn quite a bit by simple exposure. They can find patterns in the input, discover regularities, and build up various clever ways to represent features of the input. And if you don't allow a deep neural net to mull over plenty of input first, it's fiendishly hard to do any high-level training.
Now, I don't want to claim that this is actually how language works in the brain. But it's certainly a lovely metaphor, isn't it? Exposure lays the groundwork for everything that comes later.
Edited by emk on 30 January 2014 at 6:38pm
4 persons have voted this message useful
| patrickwilken Senior Member Germany radiant-flux.net Joined 4531 days ago 1546 posts - 3200 votes Studies: German
| Message 923 of 1317 30 January 2014 at 7:32pm | IP Logged |
emk wrote:
This is a post for patrickwilken, and any other geeks in the audience. |
:) Good times..
Edited by patrickwilken on 30 January 2014 at 7:33pm
1 person has voted this message useful
| patrickwilken Senior Member Germany radiant-flux.net Joined 4531 days ago 1546 posts - 3200 votes Studies: German
| Message 924 of 1317 31 January 2014 at 11:34am | IP Logged |
emk wrote:
Oh, and I just received the French dub of Neon Genesis Evangelion. After watching two episodes, two thoughts: (1) it's possible that this is actually pretty good, but (2) the animation of the various female characters is pretty ridiculous—like a bad comic book. Anyway, this is supposed to be one of the all-time best series of this sort, so I'll give it a few more episodes.
I missed the NGE reference. I have to admit I really enjoyed the original series, but perhaps my tastes are more low-brow than yours. Are you watching the original series or the remakes?
Have you watched Cowboy Bebop?
Edited by patrickwilken on 31 January 2014 at 11:42am
1 person has voted this message useful
| geoffw Triglot Senior Member United States Joined 4686 days ago 1134 posts - 1865 votes Speaks: English*, German, Yiddish Studies: Modern Hebrew, French, Dutch, Italian, Russian
| Message 925 of 1317 31 January 2014 at 4:09pm | IP Logged |
Forgive my ignorance, but isn't "Parallel Distributed Processing" redundant? Is it possible to do parallel processing without distributing the code and data?
1 person has voted this message useful
emk Diglot Moderator United States Joined 5530 days ago 2615 posts - 8806 votes Speaks: English*, FrenchB2 Studies: Spanish, Ancient Egyptian Personal Language Map
| Message 926 of 1317 31 January 2014 at 4:21pm | IP Logged |
patrickwilken wrote:
I missed the NGE reference. I have to admit I really enjoyed the original series, but perhaps my tastes are more low-brow than yours. Are you watching the original series or the remakes?
I think it's the original edition. Anyway, it's this one right here.
I have nothing against low-brow entertainment. I like Bunny Maloney, which is about as low-brow as it gets. :-) My problem with NGE is, well, the way that all of the women look like blow-up dolls, and the way that the camera is utterly obsessed with their bodies. It's distracting, and it feels like the show is trying way too hard to manipulate the viewer.
I have nothing against a tasteful amount of eye-candy—if a James Bond movie wants to show off Daniel Craig's pecs, or if Ghost in the Shell: Stand Alone Complex wants to occasionally show off Major Kusanagi's looks, that doesn't bother me. And of course, different people have totally different thresholds for this sort of thing. But to me, NGE feels like it's trying a bit too hard.
I'll take a look at Cowboy Bebop.
Tara Duncan
I've been watching the occasional episode of the Tara Duncan TV series on VoilaTV. At first glance, it looks like one of those "I was a teenage witch" series, except aimed at a pre-teen audience. But I could describe it just as easily as "a French Harry Potter with a female protagonist" or as "a younger version of Buffy with less relationship angst." Quality-wise, the TV series is perfectly competent, but nobody would mistake it for a classic like Avatar.
Anyway, the TV series is based on a series of books, and I ordered the first one on a whim. It's not Harry Potter, but you know, if you're looking for an older kids' fantasy series, it's not bad: a young protagonist with mysterious powers, family secrets, masked villains, and all that good stuff. Plus, there's something like 10 books now.
I know there's somebody reading this log who's going to say, "Hey, that sounds just like what I was looking for!" If that's you, go check it out. :-)
1 person has voted this message useful
| patrickwilken Senior Member Germany radiant-flux.net Joined 4531 days ago 1546 posts - 3200 votes Studies: German
| Message 927 of 1317 31 January 2014 at 4:43pm | IP Logged |
Neural Networks
geoffw wrote:
Forgive my ignorance, but isn't "Parallel Distributed Processing" redundant? Is it possible to do parallel processing without distributing the code and data?
PDP was meant as a sort of response to more traditional AI approaches that involved the step-by-step manipulation of symbols in a program.
In contrast, in PDP approaches computation occurred in parallel, and semantic meaning (the "symbols") was distributed across multiple nodes within the computational architecture.
This is a helpful introduction: http://plato.stanford.edu/entries/connectionism/
Anime
Yes, that's the original series. The studio has been recutting and re-releasing new versions of the series as a series of movies: 1.11, 2.22, etc.
emk wrote:
I have nothing against low-brow entertainment. I like Bunny Maloney, which is about as low-brow as it gets. :-) My problem with NGE is, well, the way that all of the women look like blow-up dolls, and the way that the camera is utterly obsessed with their bodies. It's distracting, and it feels like the show is trying way too hard to manipulate the viewer.
It's been ages since I saw the show, but I remember thinking that there was a lot of depth to all the main characters in the show, and the female characters were presented as strong and interesting individuals.
But, yeah, Japanese anime (at least from the 1980s-1990s) does have this weird way of presenting women with huge eyes and a sort of Barbie-doll, childlike appearance. I haven't watched any anime for a long time, so I don't know whether the genre has progressed in the last twenty years.
You are not alone: a quick Google search for "sexism in anime" (in quotes) produces seventy-eight thousand hits.
------
For what it's worth I also remember enjoying Samurai Champloo.
Edited by patrickwilken on 31 January 2014 at 5:20pm
1 person has voted this message useful
| geoffw Triglot Senior Member United States Joined 4686 days ago 1134 posts - 1865 votes Speaks: English*, German, Yiddish Studies: Modern Hebrew, French, Dutch, Italian, Russian
| Message 928 of 1317 31 January 2014 at 4:55pm | IP Logged |
Sounds like "distributed" refers to the data (semantic meaning, symbols) being distributed, as opposed to merely distributing the processing functions (e.g., optimized code may execute different sections of a program in parallel, while each processing node still handles all the data for its particular section).
Not crucial. Perhaps I'm missing the boat because of my perspective: I've written code for exploiting parallel processors before, but my AI experience is limited to cruising through the intro/survey course.
1 person has voted this message useful