I'm building a custom Encoder on top of the Codable API, and I intend to provide a lot of functionality for free by relying on the default encode(to:) implementation instead of requiring a custom one. It works as expected for structures, but with class instances that use inheritance I stumbled on behaviour where only the superclass keys get encoded.
Here's the test function I'm using:
func testNestedClassWCodingKeys() {
    class L1: Codable {
        init() {
            L1_A_KEY = "AAAA"
            L1_B_KEY = 222
            L1_D_KEY = 4.4
        }
        var L1_A_KEY: String
        var L1_B_KEY: Int
        var L1_C_KEY: Int = 2
        var L1_D_KEY: Float
    }
    // L2 inherits its Codable conformance from L1
    class L2: L1 {
        override init() {
            L2_A_KEY = "L2222"
            L2_B_KEY = 222333
            L2_C_KEY = L3(L3_A_KEY: "L3333", L3_B_KEY: 333)
            super.init()
        }
        // Decoding is not under test here
        required init(from decoder: Decoder) throws {
            fatalError("init(from:) has not been implemented")
        }
        var L2_A_KEY: String
        var L2_B_KEY: Int
        struct L3: Codable {
            var L3_A_KEY: String
            var L3_B_KEY: Int
        }
        var L2_C_KEY: L3
    }
    let t = L2()
    debugPrint(t)
    let encoder = TDGBinaryEncoder()
    XCTAssertNoThrow(try encoder.encode(t))
}
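(A cross-check we haven't added to the test yet, sketched here: run the same instance through Foundation's JSONEncoder. If its output also contains only the L1 keys, the truncation would originate in the synthesized encode(to:) itself rather than in our encoder.)

// Hypothetical cross-check, not part of the test above:
// does Foundation's own encoder emit the L2 keys?
let json = try JSONEncoder().encode(t)
debugPrint(String(data: json, encoding: .utf8)!)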
As you can see, we don't use any custom encode(to:) implementations.
The first thing our encoder does when encode() (line 44) is called is to respect the Encodable encode(to:) implementation, default or custom, and call it:
func encode(_ encodable: Encodable) throws {
    // Respecting default and custom implementations
    debugPrint("request for encoding", encodable)
    try encodable.encode(to: self)
}
Then, as expected, the default encode(to:) implementation asks for a keyed container and encodes the superclass (L1) keys.
Here's the debug output:
"line 38: creating keyed container"
"line 74: encoding L1_A_KEY"
"line 74: encoding L1_B_KEY"
"line 74: encoding L1_C_KEY"
"line 74: encoding L1_D_KEY"
And then it stops...
But we expected the default encoding implementation to also create a nested keyed container for L2 and encode its keys.
Could you please explain why the L2 keys are not being encoded?
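For reference, here is roughly the override we hoped not to have to write by hand (a sketch, untested against TDGBinaryEncoder; the CodingKeys enum has to be written manually in this approach):

// Added inside class L2:
private enum CodingKeys: String, CodingKey {
    case L2_A_KEY, L2_B_KEY, L2_C_KEY
}

override func encode(to encoder: Encoder) throws {
    var container = encoder.container(keyedBy: CodingKeys.self)
    try container.encode(L2_A_KEY, forKey: .L2_A_KEY)
    try container.encode(L2_B_KEY, forKey: .L2_B_KEY)
    try container.encode(L2_C_KEY, forKey: .L2_C_KEY)
    // Give the superclass its own (super) encoder so the L1 keys are written too
    try super.encode(to: container.superEncoder())
}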
P.S. We can send the whole project for investigation.