Hello, I'd appreciate some help fixing an error in my DeepMLP code.
I'm running the code on Colab.
Code:
Error:
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-20-bbbe48116659> in <module>()
     43
     44 mlp.compile(loss = 'mse', optimizer = Adam(learning_rate = 0.1), metrics = ['accuracy'])
---> 45 hist = mlp.fit(x_train, y_train, batch_size = 32000, epochs = 30, validation_data = (x_test, y_test), verbose = 2)
     46
     47 res = mlp.evaluate(x_test, y_test, verbose = 0)

4 frames
/usr/local/lib/python3.7/dist-packages/tensorflow/python/keras/engine/training.py in fit(self, x, y, batch_size, epochs, verbose, callbacks, validation_split, validation_data, shuffle, class_weight, sample_weight, initial_epoch, steps_per_epoch, validation_steps, validation_batch_size, validation_freq, max_queue_size, workers, use_multiprocessing)
   1145         use_multiprocessing=use_multiprocessing,
   1146         model=self,
-> 1147         steps_per_execution=self._steps_per_execution)
   1148
   1149     # Container that configures and calls `tf.keras.Callback`s.

/usr/local/lib/python3.7/dist-packages/tensorflow/python/keras/engine/data_adapter.py in get_data_handler(*args, **kwargs)
   1362   if getattr(kwargs["model"], "_cluster_coordinator", None):
   1363     return _ClusterCoordinatorDataHandler(*args, **kwargs)
-> 1364   return DataHandler(*args, **kwargs)
   1365
   1366

/usr/local/lib/python3.7/dist-packages/tensorflow/python/keras/engine/data_adapter.py in __init__(self, x, y, sample_weight, batch_size, steps_per_epoch, initial_epoch, epochs, shuffle, class_weight, max_queue_size, workers, use_multiprocessing, model, steps_per_execution, distribute)
   1164         use_multiprocessing=use_multiprocessing,
   1165         distribution_strategy=ds_context.get_strategy(),
-> 1166         model=model)
   1167
   1168     strategy = ds_context.get_strategy()

/usr/local/lib/python3.7/dist-packages/tensorflow/python/keras/engine/data_adapter.py in __init__(self, x, y, sample_weights, sample_weight_modes, batch_size, epochs, steps, shuffle, **kwargs)
    256
    257     num_samples = set(int(i.shape[0]) for i in nest.flatten(inputs)).pop()
--> 258     _check_data_cardinality(inputs)
    259
    260     # If batch_size is not passed but steps is, calculate from the input data.

/usr/local/lib/python3.7/dist-packages/tensorflow/python/keras/engine/data_adapter.py in _check_data_cardinality(data)
   1628         label, ", ".join(str(i.shape[0]) for i in nest.flatten(single_data)))
   1629     msg += "Make sure all arrays contain the same number of samples."
-> 1630     raise ValueError(msg)
   1631
   1632

ValueError: Data cardinality is ambiguous:
  x sizes: 32000
  y sizes: 1600
Make sure all arrays contain the same number of samples.

It looks like the error occurs because the sizes of x and y don't match, but I don't know how I should modify the code.
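Since the original code isn't shown, here is a minimal sketch of how this error typically arises. `model.fit` requires `x` and `y` to have the same number of samples along axis 0, and 32000 is exactly 20 × 1600, so a common cause is flattening `x` with `reshape(-1, features)`, which multiplies an extra axis (e.g. a time or window dimension of 20) into the sample axis. The shapes below are assumptions chosen only to reproduce the 32000-vs-1600 mismatch, not the actual data:

```python
import numpy as np

# Hypothetical data: 1600 samples, each a (20, 8) window (assumed shapes).
x_train = np.random.rand(1600, 20, 8)
y_train = np.random.rand(1600, 1)

# Bug pattern: reshape(-1, features) folds the 20-step axis into axis 0,
# so the sample count becomes 1600 * 20 = 32000 and no longer matches y.
x_bad = x_train.reshape(-1, 8)
print(x_bad.shape)    # (32000, 8) -> "x sizes: 32000, y sizes: 1600"

# Fix: keep the sample axis intact and flatten only the remaining axes.
x_good = x_train.reshape(x_train.shape[0], -1)
print(x_good.shape)   # (1600, 160)

# This is the invariant Keras checks before fitting.
assert x_good.shape[0] == y_train.shape[0]
```

Before calling `mlp.fit`, printing `x_train.shape` and `y_train.shape` and confirming the first dimensions match should pinpoint where the extra factor of 20 comes from.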