I have solved the problem.
There were two possibilities:
1) The datanode is full. Please check whether your datanode is full. I use this command: $ hdfs dfs -du -h /
(see also: Know the disk space of data nodes in hadoop?)
2) The datanode is not working properly (regardless of whether the namenode is). If the namenode is up but the datanode is down, HDFS may let you create files but not write to them; if both the namenode and the datanode are down, you can neither create nor write files.
You can check the address your clients use to connect in core-site.xml.
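To check whether the datanodes are actually up, assuming the `hdfs` and `jps` commands from your Hadoop/JDK installation are on the PATH, you can do something like this (output depends on your cluster, so treat it as a sketch):

```shell
# List the Hadoop daemons running on this machine
# (look for DataNode and NameNode entries)
jps

# Ask the namenode for a cluster report: capacity, DFS used,
# and the list of live and dead datanodes
hdfs dfsadmin -report
```

If `jps` shows no DataNode process, or `dfsadmin -report` lists zero live datanodes, you are in the second case.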
Outdated:
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
<property>
<name>fs.default.name</name>
<value>hdfs://0.0.0.0:8020</value>
<description>Name of the default filesystem.</description>
</property>
</configuration>
Updated:
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
<property>
<name>fs.default.name</name>
<value>hdfs://namenode_ip:8020</value>
<description>Name of the default filesystem.</description>
</property>
</configuration>
In my case it was the second option: I was trying to write a byte array to a file and could not, even though I could create the file through FileSystem and the connection appeared to be working. Updating the address in core-site.xml fixed it.
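For reference, a minimal sketch of writing a byte array to HDFS with the Java client, assuming the hadoop-client dependency is on the classpath; the hostname and file path are illustrative placeholders. (On newer Hadoop versions the property is named fs.defaultFS; fs.default.name still works as a deprecated alias.)

```java
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class WriteBytesToHdfs {
    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        // Must point at the namenode. With a wrong or unreachable address,
        // the client can sometimes create the file entry but the actual
        // write to the datanodes fails -- the symptom described above.
        conf.set("fs.default.name", "hdfs://namenode_ip:8020"); // placeholder host

        byte[] data = "hello hdfs".getBytes("UTF-8");

        // try-with-resources closes the stream, which flushes the write
        try (FileSystem fs = FileSystem.get(conf);
             FSDataOutputStream out = fs.create(new Path("/tmp/example.bin"))) {
            out.write(data);
        }
    }
}
```

Forgetting to close (or flush) the output stream is another common reason the file exists but stays empty.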