Set each cell's encoding to ENCODING_UTF_16 and write 6 cells in a row. The 2nd, 3rd, 4th, and 5th cells hold long strings (9000 characters); the 1st and last are very short. Trying to open the generated Excel file produces "Not Enough Memory" or "Corrupted file". See the attached program. However, the file is fine without the last cell (comment out line 40 of the attached file) or without the encoding.

import java.io.FileOutputStream;

import org.apache.poi.hssf.usermodel.HSSFCell;
import org.apache.poi.hssf.usermodel.HSSFCellStyle;
import org.apache.poi.hssf.usermodel.HSSFFont;
import org.apache.poi.hssf.usermodel.HSSFRow;
import org.apache.poi.hssf.usermodel.HSSFSheet;
import org.apache.poi.hssf.usermodel.HSSFWorkbook;
import org.apache.poi.hssf.util.HSSFColor;

public class PoiTest {

    private int longStringSize = 9000;

    public static void main(String[] args) {
        PoiTest pt = new PoiTest();
        pt.writeExcel();
    }

    public PoiTest() {
    }

    public void writeExcel() {
        try {
            HSSFWorkbook wb = new HSSFWorkbook();
            HSSFSheet sheet = wb.createSheet("new sheet");

            // Create a row and put some cells in it. Rows are 0 based.
            HSSFRow row = sheet.createRow((short) 0);

            // Create a cell and put a value in it.
            String longString = createLongString();
            createCellWithString(row, 0, "1");
            createCellWithString(row, 1, longString);
            createCellWithString(row, 2, longString);
            createCellWithString(row, 3, longString);
            createCellWithString(row, 4, longString);
            createCellWithString(row, 5, "end");

            // Write the output to a file
            FileOutputStream fileOut = new FileOutputStream("workbook.xls");
            wb.write(fileOut);
            fileOut.close();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    private void createCellWithString(HSSFRow row, int cellId, String cellStr) {
        HSSFCell cell = row.createCell((short) cellId);
        cell.setEncoding(HSSFCell.ENCODING_UTF_16);
        cell.setCellValue(cellStr);
    }

    private String createLongString() {
        StringBuffer sb = new StringBuffer();
        for (int i = 0; i < longStringSize; i++) {
            sb.append('a');
        }
        sb.append(longStringSize);
        return sb.toString();
    }
}
Please try upgrading to the latest release.
Just tried with the latest version 1.10. Still got the same problem.
*** Bug 20045 has been marked as a duplicate of this bug. ***
This needs to be fixed, but I think it makes more sense to do it in 3.0. 9000 characters is an awfully long string. I suspect I know why: I bet that we're not continuing such a thing correctly. It's suspiciously close to the max SST record size.
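The arithmetic supports that suspicion. A minimal sketch of the sizes involved, assuming the BIFF8 limit of 8224 bytes of data per record (a payload larger than that must spill into CONTINUE records); the class name and constant are illustrative, not POI API:

```java
public class SstSizeCheck {
    // Assumption from the BIFF8 file format: a record's data portion is
    // capped at 8224 bytes, with overflow carried in CONTINUE records.
    static final int MAX_RECORD_DATA = 8224;

    public static void main(String[] args) {
        int chars = 9000;        // length of each long string in the repro
        int bytesPerChar = 2;    // UTF-16 doubles the byte count vs. compressed
        int longStrings = 4;     // cells 2 through 5 all hold the long string

        int payload = chars * bytesPerChar * longStrings; // 72000 bytes
        // ceil(payload / MAX_RECORD_DATA) - 1 = CONTINUE records needed
        int continues = (payload - 1) / MAX_RECORD_DATA;

        System.out.println("SST string payload: " + payload + " bytes");
        System.out.println("CONTINUE records required: " + continues);
    }
}
```

So even one 9000-character UTF-16 string (18000 bytes) overflows a single record, and the four of them force the SST across many CONTINUE boundaries, which is exactly the code path under suspicion.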
Fixed in HEAD. The problem was in the SST serializer's handling of multiple CONTINUE records.
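For readers unfamiliar with the mechanism: a sketch of the raw chunking an SST serializer has to perform, assuming the 8224-byte BIFF8 record-data limit. This is a hypothetical helper, not POI code, and it deliberately ignores the extra wrinkle that a string broken at a CONTINUE boundary restarts with its own option-flags byte, which is one place such serializers can go wrong:

```java
import java.util.ArrayList;
import java.util.List;

public class RecordSplitter {
    // Assumption from the BIFF8 file format: max bytes of data per record.
    static final int MAX_RECORD_DATA = 8224;

    // Split a serialized payload into record-sized chunks; the first chunk
    // becomes the SST record's data, the rest become CONTINUE records.
    static List<byte[]> split(byte[] payload) {
        List<byte[]> chunks = new ArrayList<>();
        for (int off = 0; off < payload.length; off += MAX_RECORD_DATA) {
            int len = Math.min(MAX_RECORD_DATA, payload.length - off);
            byte[] chunk = new byte[len];
            System.arraycopy(payload, off, chunk, 0, len);
            chunks.add(chunk);
        }
        return chunks;
    }

    public static void main(String[] args) {
        byte[] payload = new byte[72000]; // four 9000-char UTF-16 strings
        List<byte[]> chunks = split(payload);
        System.out.println(chunks.size() + " records (1 SST + "
                + (chunks.size() - 1) + " CONTINUE)");
    }
}
```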