I'd like to see the idea of zombies integrating into society explored more.
 
I don't want to see any more comedy-oriented zombie flicks; that shit's been done to death.
 
Personally, I think the only route left for the subgenre is a TV series. I could see that working really well... maybe the rumoured The Walking Dead series could be what I'm after.