If you are dealing with an ELT scenario in which you have to load huge volumes of files and process them later (filter, transform, and load them into traditional databases for analytics), you can use Hadoop to land the files and Netezza as the target staging area or data warehouse. With Hadoop you can put all your files into HDFS and then either read them with an ETL tool to transform and filter them, or use Hive SQL to query the data in those files directly. However, Hive, the Hadoop-based data warehouse, does not support updates and does not support all SQL statements. Hence, it is often better to read those files from HDFS, apply the filters and transformations there, and load the result into a traditional data warehouse appliance such as Netezza, where you can write the queries for your cubes.
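As a rough sketch of the Hadoop side of that flow (the table name, columns, delimiter, and HDFS paths below are all hypothetical), you might expose the raw files to Hive as an external table and write the filtered, transformed output back to HDFS for extraction into Netezza:

-- Point an external Hive table at raw files already sitting in HDFS.
-- Schema, delimiter, and location are illustrative assumptions.
CREATE EXTERNAL TABLE raw_sales (
  sale_id     BIGINT,
  store_id    INT,
  sale_amount DECIMAL(12,2),
  sale_date   STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/data/landing/sales/';

-- Filter and transform in Hive, writing the cleaned result back to HDFS
-- so it can be extracted and loaded into Netezza downstream.
INSERT OVERWRITE DIRECTORY '/data/export/sales_clean'
SELECT sale_id, store_id, sale_amount, to_date(sale_date)
FROM raw_sales
WHERE sale_amount > 0;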
If you are loading gigabytes of data into Netezza daily across landing, staging, and mart areas, you will most likely end up using a lot of space. In this scenario you can move your landing area onto Hadoop and keep your staging and mart areas on Netezza. If your queries are simple and you are not doing very complex filtering or updates to the source, you may even be able to manage everything with Hadoop.
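One way to sketch that split (again, every name, path, and option here is invented for illustration): keep a date-partitioned external Hive table as the landing area, so old partitions can be dropped cheaply instead of consuming Netezza space, and load only the cleaned output into a Netezza staging table, for example via a transient external table:

-- Landing area on Hadoop: external Hive table partitioned by load date.
CREATE EXTERNAL TABLE landing_sales (
  sale_id     BIGINT,
  store_id    INT,
  sale_amount DECIMAL(12,2)
)
PARTITIONED BY (load_dt STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/data/landing/sales/';

-- Register each day's files as a new partition as they arrive.
ALTER TABLE landing_sales ADD PARTITION (load_dt = '2014-12-01')
LOCATION '/data/landing/sales/2014-12-01/';

-- Staging area on Netezza: load an exported flat file through a
-- transient external table (file path and delimiter are assumptions).
INSERT INTO stg_sales
SELECT * FROM EXTERNAL '/export/sales_clean.csv'
SAMEAS stg_sales
USING (DELIMITER ',');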
To conclude, Hadoop is ideal for huge volumes of data, but it does not support all the functionality of a traditional data warehouse.
You can check out this link to see the differences:
http://dwbitechguru.blogspot.ca/2014/12/how-to-select-between-hadoop-vs-netezza.html