Even if init(_ value: Int) for some reason uses an intermediate Double, this would not explain the precision loss, since 15123122 is exactly representable as a 64-bit binary floating point number.
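As a quick sanity check (a minimal sketch, not part of the original report), one can verify that the value round-trips through Double without any loss:

// 15123122 is less than 2^24, so it fits easily within Double's
// 52 fraction bits; the conversion to Double is exact.
let n = 15123122
let d = Double(n)
print(d == 15123122.0)       // true: the Double holds the exact value
print(Int(exactly: d) == n)  // true: converting back recovers the original Int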
The behavior is different on Linux (Ubuntu 16.04, development snapshot from 8 June 2018):
Welcome to Swift version 4.2-dev (LLVM 031e148970, Clang b58a7ad218, Swift 4b9373622f). Type :help for assistance.
1> import Foundation
2> print(Decimal(15123122))
15123122
3>
4> Decimal(integerLiteral: 15123122) == Decimal(string: "15123122")!
$R0: Bool = true
5>
Another observation: Decimal(_:) and the literal-vs-string comparison behave as expected, but Decimal(exactly:) crashes at runtime:

import Foundation
Decimal(integerLiteral: 15123122) == Decimal(string: "15123122") // true
let n = 15123122
let a = Decimal(n) // works
let b = Decimal(exactly: n) // error: Execution was interrupted, reason: EXC_BAD_INSTRUCTION (code=EXC_I386_INVOP, subcode=0x0).
let c = Decimal(exactly: 15123122) // error: Execution was interrupted, reason: EXC_BAD_INSTRUCTION (code=EXC_I386_INVOP, subcode=0x0).
Looks like that was fixed! (Works now with Xcode 6.3.)
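For completeness, here is a minimal verification one might run on a fixed toolchain (a sketch; the exact behavior depends on the Swift/Xcode version in use):

import Foundation

// On a toolchain containing the fix, both initializers should produce
// the exact value and Decimal(exactly:) should no longer trap.
let viaInit = Decimal(15123122)
print(viaInit)                                    // 15123122
print(viaInit == Decimal(string: "15123122")!)    // true
if let viaExactly = Decimal(exactly: 15123122) {  // used to crash, see above
    print(viaExactly)                             // 15123122
}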