The cost function in logistic regression is more complex than the one used in linear regression. If we applied mean squared error to logistic regression, the resulting function would be non-convex with many local minima, making it difficult to optimize with gradient descent. Instead, logistic regression uses cross entropy, also called log loss, as its cost function.
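As a minimal sketch of how log loss is computed, the snippet below averages -[y log(ŷ) + (1 - y) log(1 - ŷ)] over a batch of predictions; the function name `log_loss` and the sample arrays are illustrative, not from the original text:

```python
import numpy as np

def log_loss(y_true, y_pred, eps=1e-15):
    """Cross-entropy (log loss) for binary classification."""
    # Clip predicted probabilities away from 0 and 1 to avoid log(0)
    y_pred = np.clip(y_pred, eps, 1 - eps)
    # Average the per-example cross-entropy terms
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

# Example: labels and predicted probabilities for four examples
y_true = np.array([1, 0, 1, 1])
y_pred = np.array([0.9, 0.1, 0.8, 0.6])
print(log_loss(y_true, y_pred))
```

Note that confident correct predictions (e.g. 0.9 for a true label of 1) contribute little to the loss, while confident wrong predictions are penalized heavily, which is what makes the surface convex and well suited to gradient descent.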