Human Generated Data

Title

[Men and truck]

Date

1930-1935

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.303.17

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-05-29

Person 99.9
Human 99.9
Person 99.1
Bicycle 98.7
Bike 98.7
Transportation 98.7
Vehicle 98.7
Person 97
Person 96.9
Machine 96.5
Wheel 96.5
Wheel 95.5
Person 92.7
Car 91.7
Automobile 91.7
Person 91.6
Clothing 82.7
Apparel 82.7
Wheel 77.7
Coat 72
Overcoat 72
Antique Car 64.1
Model T 64.1
Truck 58.8
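Each machine-generated tag above pairs a label with a confidence score (a percentage). A minimal sketch of filtering such a list by a confidence threshold, the way low-confidence labels are typically suppressed when displaying these records; the data is a small excerpt from the Amazon list above, and the function name and threshold are illustrative assumptions:

```python
# Excerpt of (label, confidence) pairs from the Amazon tag list above.
tags = [
    ("Person", 99.9),
    ("Bicycle", 98.7),
    ("Wheel", 96.5),
    ("Antique Car", 64.1),
    ("Truck", 58.8),
]

def confident_tags(tags, threshold=90.0):
    """Keep only labels whose confidence (percent) meets the threshold."""
    return [label for label, score in tags if score >= threshold]

print(confident_tags(tags))  # ['Person', 'Bicycle', 'Wheel']
```

With the default 90% cutoff, the uncertain "Antique Car" and "Truck" guesses drop out while the high-confidence labels survive.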

Clarifai
created on 2019-05-29

people 100
vehicle 99.6
group 99.4
group together 99.1
adult 98.5
transportation system 97.3
several 96.8
man 96.2
many 94.7
administration 94.6
woman 94
four 92.7
leader 92.1
child 90.2
wear 89.9
three 89
five 88.2
two 87.8
military 86.4
war 86.3

Imagga
created on 2019-05-29

passenger 26.7
old 25.1
groom 18.2
architecture 18
people 16.7
building 16
person 15.8
car 15.3
statue 15.2
history 14.3
man 14.1
tourism 14
ancient 13.8
wheelchair 13.8
chair 13.1
monument 13.1
historic 12.8
dress 12.6
sculpture 12.4
male 12.1
travel 12
culture 12
men 11.2
adult 11
stone 11
world 10.9
wheeled vehicle 10.9
landmark 10.8
traditional 10.8
bride 10.7
marble 10.6
couple 10.4
art 10.4
vehicle 10.3
happiness 10.2
motor vehicle 9.7
tourist 9.7
home 9.6
street 9.2
wedding 9.2
house 9.2
city 9.1
portrait 9.1
road 9
religion 9
happy 8.8
truck 8.5
patient 8.5
two 8.5
church 8.3
park 8.2
transportation 8.1
celebration 8
smiling 8
love 7.9
black 7.8
palace 7.7
married 7.7
mother 7.6
nurse 7.6
senior 7.5
famous 7.4
care 7.4
room 7.4
seat 7.3
smile 7.1
hospital 7.1

Google
created on 2019-05-29

Microsoft
created on 2019-05-29

person 97.4
land vehicle 94.8
old 93.1
bicycle 92.6
outdoor 92.1
vehicle 91.8
wheel 89.4
posing 81.9
clothing 76.8
people 61.1
man 51.6
vintage 30.9

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Female, 50.7%
Happy 45.9%
Disgusted 46.5%
Surprised 45.7%
Confused 45.7%
Sad 47.2%
Angry 45.7%
Calm 48.4%

AWS Rekognition

Age 23-38
Gender Female, 53.3%
Angry 45.6%
Sad 46.1%
Surprised 46.5%
Confused 45.3%
Happy 47.7%
Calm 47.6%
Disgusted 46.3%

AWS Rekognition

Age 35-52
Gender Female, 53.8%
Angry 46.6%
Calm 45.9%
Confused 45.8%
Disgusted 47.2%
Happy 47%
Sad 46.4%
Surprised 46%

AWS Rekognition

Age 26-43
Gender Female, 54.7%
Angry 45.6%
Surprised 47%
Confused 46%
Calm 49.7%
Happy 45.3%
Disgusted 45.4%
Sad 46.1%

AWS Rekognition

Age 26-43
Gender Female, 54.9%
Sad 45.5%
Happy 45.4%
Angry 45.4%
Confused 46.7%
Surprised 47%
Disgusted 45.3%
Calm 49.7%
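Each AWS Rekognition face record above lists a confidence score for eight emotions. A minimal sketch of reducing one such record to its dominant emotion, using the scores from the first face above; the dictionary layout is an illustrative assumption, not the actual Rekognition response format:

```python
# Emotion scores (percent) from the first AWS Rekognition face record above.
emotions = {
    "Happy": 45.9,
    "Disgusted": 46.5,
    "Surprised": 45.7,
    "Confused": 45.7,
    "Sad": 47.2,
    "Angry": 45.7,
    "Calm": 48.4,
}

def dominant_emotion(scores):
    """Return the emotion label with the highest confidence score."""
    return max(scores, key=scores.get)

print(dominant_emotion(emotions))  # Calm
```

Note how close the scores are (all within a few points of 45%): the "dominant" emotion here carries little certainty, which is typical for small faces in old photographs.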

Feature analysis

Amazon

Person 99.9%
Bicycle 98.7%
Wheel 96.5%
Truck 58.8%

Captions

Microsoft

a vintage photo of a group of people posing for the camera 97%
a vintage photo of a group of people posing for a picture 96.9%
an old photo of a group of people posing for the camera 96.8%

Text analysis

Amazon

+
heme
mok heme
mok