Human Generated Data

Title

[Man in antique car]

Date

1942-1943

People

Artist: Lyonel Feininger, American, 1871 - 1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.575.32

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2021-12-13

Wheel 100
Machine 100
Wheel 99.4
Person 99.3
Human 99.3
Person 99.3
Model T 98.6
Antique Car 98.6
Vehicle 98.6
Transportation 98.6
Automobile 98.6
Car 98.1
Wheel 89.6
Tire 81.1
Person 63.5
Wheel 53.3
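
The Amazon values above are label-detection confidences on a 0-100 scale. A minimal sketch of how comparable tags could be generated with AWS Rekognition via boto3 follows; the local file name photo.jpg and the 50-point confidence floor are assumptions for illustration, not part of this record, and AWS credentials are presumed to be configured.

# Sketch: Rekognition label detection for a local image (photo.jpg is a placeholder).
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=50,  # assumed threshold; the list above bottoms out around 53
    )

for label in response["Labels"]:
    # Each label carries a name and a 0-100 confidence, like the tags listed above.
    print(f"{label['Name']} {label['Confidence']:.1f}")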

Clarifai
created on 2023-10-22

vehicle 99.9
people 99.7
transportation system 99.7
car 98.8
driver 97.3
vintage 97
carriage 96.8
wheel 96.3
two 95.9
adult 94.6
nostalgia 94.3
convertible 93.1
retro 92.9
military 91.9
man 91.1
one 89.6
wagon 88.1
cart 86.3
war 84.7
three 83.5

Imagga
created on 2021-12-13

model t 100
car 100
motor vehicle 64
cart 51.9
vehicle 48.4
wheeled vehicle 44.4
horse cart 41.2
wagon 39.6
wheel 33.2
carriage 32.3
transportation 30.5
transport 30.1
old 27.2
auto 25.8
antique 22.7
road 19.9
rural 18.5
farm 17.8
tractor 17.7
automobile 17.2
machinery 16.6
machine 16.3
drive 16.1
grass 15.8
vintage 15.7
engine 15.4
truck 13.5
field 13.4
outdoor 13
tire 13
wheels 12.7
landscape 12.6
motor 12.6
agriculture 12.3
hay 12.2
outdoors 11.9
work 11.8
driving 11.6
sky 11.5
retro 11.5
speed 11
equipment 10.6
classic 10.2
man 10.1
sport 9.9
travel 9.9
summer 9.6
oxcart 9.5
horse 9.5
farming 9.5
metal 8.8
race 8.6
men 8.6
dirt 8.6
people 8.4
industrial 8.2
bumper 7.9
male 7.8
driver 7.8
broken 7.7
outside 7.7
historic 7.3
dirty 7.2
history 7.2

Microsoft
created on 2021-12-13

outdoor 99.5
land vehicle 97.8
wheel 97.5
road 96.8
vehicle 95.5
tire 83.1
transport 72.6
vintage car 66.3
text 66.1
auto part 58.5
old 53
horse-drawn vehicle 52.5
car 20.3

Face analysis

AWS Rekognition

Age 49-67
Gender Female, 87%
Happy 85.9%
Calm 12.5%
Sad 0.8%
Disgusted 0.2%
Confused 0.2%
Surprised 0.2%
Angry 0.1%
Fear 0.1%
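
The age range, gender, and emotion percentages above are the kind of output AWS Rekognition face detection returns. A minimal sketch, reusing the photo.jpg placeholder and assuming configured AWS credentials:

import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.0f}%")
    # Emotions come back as TYPE/confidence pairs; sort to mirror the list above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")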

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Likely
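
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why this block reads "Likely" instead of a number. A hedged sketch with the google-cloud-vision client, again using a placeholder file path:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum; .name gives e.g. "LIKELY".
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)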

Feature analysis

Amazon

Wheel 100%
Wheel 99.4%
Wheel 89.6%
Wheel 53.3%
Person 99.3%
Person 99.3%
Person 63.5%
Car 98.1%

Categories

Imagga

paintings art 99.9%

Captions