If you want to disallow directories without also disallowing the files in the site root, you will need to use wildcards:
User-agent: *
Allow: /public/section1/
Disallow: /*/
The above will allow all of the following:
http://example.com/
http://example.com/somefile
http://example.com/public/section1/
http://example.com/public/section1/somefile
http://example.com/public/section1/somedir/
http://example.com/public/section1/somedir/somefile
And it will disallow all of the following:
http://example.com/somedir/
http://example.com/somedir/somefile
http://example.com/somedir/otherdir/somefile
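To see why each URL lands on the allowed or disallowed side, here is a minimal Python sketch of how a wildcard-aware crawler might evaluate these rules. It assumes Google-style matching, where * matches any run of characters, a trailing $ anchors the end of the path, and the longest matching rule wins (with Allow winning ties). The rule list and helper names are purely illustrative, not part of any real crawler library.

    import re

    # Illustrative rule list mirroring the robots.txt above (not a real library API).
    RULES = [
        ("allow", "/public/section1/"),
        ("disallow", "/*/"),
    ]

    def pattern_to_regex(pattern):
        # '*' matches any run of characters; a trailing '$' anchors the end of the path.
        regex = re.escape(pattern).replace(r"\*", ".*")
        if regex.endswith(r"\$"):
            regex = regex[:-2] + "$"
        return re.compile("^" + regex)

    def is_allowed(path):
        # Longest matching pattern wins; on a tie, Allow beats Disallow.
        best = None
        for verdict, pattern in RULES:
            if pattern_to_regex(pattern).match(path):
                candidate = (len(pattern), verdict == "allow")
                if best is None or candidate > best:
                    best = candidate
        return True if best is None else best[1]

    for path in ["/", "/somefile", "/public/section1/somefile",
                 "/somedir/", "/somedir/somefile"]:
        print(path, "->", "allowed" if is_allowed(path) else "disallowed")

Note that Python's built-in urllib.robotparser follows the original specification and does simple prefix matching, so it would treat "/*/" literally rather than as a wildcard; that is exactly the portability caveat below.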
Just be aware that wildcards are not part of the original robots.txt specification and are not supported by every crawler. All of the major search engines support them, but many other crawlers do not.