Advanced StreamReader and StreamWriter Techniques in .NET
Core File Handling Techniques
Working with StreamReader and StreamWriter goes beyond basic text operations. When dealing with structured data like arrays, you need strategic approaches. The initial challenge involves saving array contents to files. Consider a high scores scenario where integer values need persistence.
Initialize your array with hardcoded values for demonstration:
```csharp
int[] highScores = { 12, 20, 25, 38, 47 };
```
Pass false as StreamWriter's append argument so existing files are overwritten rather than appended to:
```csharp
using (StreamWriter writer = new StreamWriter("highscores.txt", false))
{
    foreach (int score in highScores)
    {
        writer.WriteLine(score);
    }
}
```
Critical step: Always wrap streams in using statements for automatic disposal. Unlike a manual Close() call, using guarantees the stream is released even when an exception is thrown.
Reading Back into Arrays
Retrieving data requires knowing the data structure. For fixed-length arrays:
```csharp
int[] loadedScores = new int[5];
using (StreamReader reader = new StreamReader("highscores.txt"))
{
    for (int i = 0; i < loadedScores.Length; i++)
    {
        loadedScores[i] = int.Parse(reader.ReadLine());
    }
}
```
Pro tip: Add error handling with int.TryParse() to avoid crashes on invalid data.
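A defensive version of the loop above might look like the following sketch, which also swaps the fixed-size array for a List&lt;int&gt; so the line count doesn't need to be known in advance:

```csharp
// Assumes highscores.txt contains one integer per line, as written earlier.
var loadedScores = new List<int>();
using (StreamReader reader = new StreamReader("highscores.txt"))
{
    string line;
    while ((line = reader.ReadLine()) != null)
    {
        if (int.TryParse(line, out int score))
        {
            loadedScores.Add(score);
        }
        // Invalid lines are skipped instead of throwing a FormatException.
    }
}
```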
Multidimensional Data Handling
Writing 2D Arrays
Structured data like player names with scores requires careful mapping:
```csharp
string[,] playerData = {
    { "David", "12" },
    { "Sally", "20" },
    { "Beatrix", "25" }
};

using (StreamWriter writer = new StreamWriter("player_scores.txt"))
{
    for (int row = 0; row < playerData.GetLength(0); row++)
    {
        writer.WriteLine($"{playerData[row, 0]} {playerData[row, 1]}");
    }
}
```
Reading Complex Structures
Avoid character-by-character parsing with this efficient method:
```csharp
List<string[]> playerList = new List<string[]>();
using (StreamReader reader = new StreamReader("player_scores.txt"))
{
    string line;
    while ((line = reader.ReadLine()) != null)
    {
        string[] fields = line.Split(' ');
        playerList.Add(fields);
    }
}
```
Key advantage: Split() handles delimiters automatically, eliminating the manual character-by-character checks. For production, prefer a delimiter that cannot appear in the data, such as a pipe (|), since player names may contain spaces.
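A sketch of the pipe-delimited approach, with the delimiter held in a constant so the writing and reading sides always agree (the file name and sample record are illustrative):

```csharp
const char Delimiter = '|';

// Writing: names may contain spaces, so pipes keep fields unambiguous.
using (StreamWriter writer = new StreamWriter("player_scores.txt"))
{
    writer.WriteLine($"Mary Ann{Delimiter}31");
}

// Reading: split on the same constant.
using (StreamReader reader = new StreamReader("player_scores.txt"))
{
    string line;
    while ((line = reader.ReadLine()) != null)
    {
        string[] fields = line.Split(Delimiter);
        string name = fields[0];   // "Mary Ann"
        int score = int.Parse(fields[1]);
    }
}
```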
Advanced File Operations
Conditional File Copying
Go beyond simple duplication with data filtering:
```csharp
using (StreamReader source = new StreamReader("highscores.txt"))
using (StreamWriter target = new StreamWriter("filtered_scores.txt"))
{
    string scoreLine;
    while ((scoreLine = source.ReadLine()) != null)
    {
        if (int.TryParse(scoreLine, out int score) && score >= 25)
        {
            target.WriteLine(score);
        }
    }
}
```
Character-Level Control
When you need granular control, use Peek() and Read():
```csharp
using (StreamReader reader = new StreamReader("data.txt"))
{
    int nextChar;
    while ((nextChar = reader.Peek()) != -1)
    {
        char actualChar = (char)reader.Read();
        // Process characters
    }
}
```
ASCII decoding tip: Convert numeric values to characters only for display purposes, keeping logic numeric for efficiency.
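For example, Read() already returns the character as an int (-1 at end of stream), so comparisons can stay numeric and the cast can wait until display time:

```csharp
using (StreamReader reader = new StreamReader("data.txt"))
{
    int next;
    while ((next = reader.Read()) != -1)
    {
        // Compare numerically: 'A' is 65, 'Z' is 90 in ASCII/UTF-16.
        if (next >= 'A' && next <= 'Z')
        {
            Console.Write((char)next); // Cast only when displaying.
        }
    }
}
```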
Pro Optimization Strategies
Buffer Management
Streams use buffers automatically, but for large files:
- Increase the buffer size via constructor parameters
- Call Flush() selectively during long write operations
- Avoid ReadToEnd() for multi-gigabyte files
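StreamReader exposes a bufferSize constructor parameter; a minimal sketch of the first bullet (the 64 KB value is an illustrative choice, not a universal recommendation):

```csharp
const int BufferSize = 64 * 1024; // 64 KB, larger than the small default

using (var stream = new FileStream("largefile.txt", FileMode.Open, FileAccess.Read))
using (var reader = new StreamReader(stream, Encoding.UTF8,
    detectEncodingFromByteOrderMarks: true, bufferSize: BufferSize))
{
    string line;
    while ((line = reader.ReadLine()) != null)
    {
        // Process line by line without loading the whole file into memory.
    }
}
```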
Asynchronous Patterns
For responsive applications during file operations:
```csharp
async Task WriteFileAsync()
{
    using (StreamWriter writer = new StreamWriter("largefile.txt"))
    {
        // GenerateLargeContent() stands in for whatever produces your data.
        await writer.WriteAsync(GenerateLargeContent());
    }
}
```
Performance note: Async keeps the UI responsive but adds complexity. Reserve it for large files or latency-sensitive applications where blocking a thread is unacceptable.
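A matching asynchronous read, sketched with ReadLineAsync so the calling thread is never blocked while waiting on I/O:

```csharp
async Task<List<int>> ReadScoresAsync()
{
    var scores = new List<int>();
    using (StreamReader reader = new StreamReader("highscores.txt"))
    {
        string line;
        while ((line = await reader.ReadLineAsync()) != null)
        {
            if (int.TryParse(line, out int score))
            {
                scores.Add(score);
            }
        }
    }
    return scores;
}
```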
Implementation Checklist
- Validate file paths with Path.Combine() for cross-platform compatibility
- Always wrap streams in using blocks
- Handle exceptions for file access errors
- Test with special characters in data
- Benchmark large file operations
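The exception-handling item from the checklist can be sketched as a try/catch around the stream; the error-reporting strategy here is an assumption to adapt to your application:

```csharp
try
{
    using (StreamReader reader = new StreamReader("highscores.txt"))
    {
        Console.WriteLine(reader.ReadToEnd());
    }
}
catch (FileNotFoundException ex) // more specific, so it must come first
{
    Console.WriteLine($"File missing: {ex.FileName}");
}
catch (IOException ex)
{
    Console.WriteLine($"I/O error: {ex.Message}");
}
```

Note that FileNotFoundException derives from IOException, so the more specific catch clause must appear first.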
Recommended Resources
- Book: "CLR via C#" by Jeffrey Richter (deep dive into .NET I/O)
- Tool: LINQPad (instant testing of file operations)
- Library: FileHelpers (for complex structured files)
Which technique will you implement first? Share your approach in the comments!