Human Generated Data

Title

Agricultural School for Prussian Junkers Learning to Hold Reins

Date

1934 (printed 1980)

People

Artist: Alfred Eisenstaedt, German, 1898-1995

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of Lufthansa German Airlines, 1989.69.13

Copyright

Photo by Alfred Eisenstaedt/The LIFE Picture Collection/Getty Images

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.6
Human 99.6
Person 99.3
Person 98.5
Person 97.3
Wheel 96
Machine 96
Tie 89.6
Accessories 89.6
Accessory 89.6
Face 77.8
Clothing 75.3
Apparel 75.3
People 68.5
Furniture 60.9
Suit 60.2
Coat 60.2
Overcoat 60.2
Leisure Activities 59.1
Tie 57.6
Vehicle 57.1
Transportation 57.1
Photography 56
Photo 56
Person 54.1

Clarifai
created on 2023-10-26

people 100
adult 99
group 98.9
group together 98
man 97.9
three 97.7
two 97.5
music 96.5
several 94.7
five 93.7
musician 93.6
vehicle 93.6
woman 93.2
administration 93.2
leader 93
four 92.3
instrument 90.5
violin 88.3
portrait 87.8
monochrome 86.9

Imagga
created on 2022-01-23

man 37.6
male 28.3
people 26.7
wind instrument 25.8
person 23.6
senior 22.5
business 21.2
brass 21.1
businessman 19.4
trombone 18
musical instrument 17.4
half track 17.2
office 16.8
old 16.7
couple 15.7
happy 15.6
adult 15.2
portrait 14.9
men 14.6
smiling 14.5
computer 14.4
vehicle 14.3
bassoon 14.3
military uniform 14.1
tracked vehicle 14
military vehicle 14
mature 13.9
laptop 13.6
together 13.1
clothing 13.1
sitting 12.9
businesswoman 12.7
uniform 12.7
professional 12.5
indoors 12.3
desk 12.3
face 12.1
oboe 11.7
elderly 11.5
corporate 11.2
work 11
executive 10.8
holding 10.7
businesspeople 10.4
happiness 10.2
suit 9.9
retired 9.7
looking 9.6
home 9.6
statue 9.5
smile 9.3
handsome 8.9
group 8.9
women 8.7
lifestyle 8.7
table 8.6
retirement 8.6
indoor 8.2
one 8.2
playing 8.2
aged 8.1
success 8
working 7.9
color 7.8
older 7.8
room 7.7
casual 7.6
leisure 7.5
manager 7.4
sax 7.4
camera 7.4
conveyance 7.3
black 7.2
history 7.1
worker 7.1
wheeled vehicle 7.1

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

person 97.9
text 92.5
man 91.9
indoor 86.2
clothing 79.4
black and white 68.9

Color Analysis

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 62-72
Gender Male, 100%
Calm 84.1%
Sad 8.5%
Surprised 3%
Confused 1.6%
Fear 1%
Disgusted 0.8%
Angry 0.8%
Happy 0.2%

AWS Rekognition

Age 51-59
Gender Male, 100%
Confused 93.7%
Calm 5.9%
Angry 0.1%
Surprised 0.1%
Sad 0.1%
Happy 0.1%
Fear 0%
Disgusted 0%

AWS Rekognition

Age 45-51
Gender Male, 99.7%
Confused 49.7%
Disgusted 23.3%
Sad 19%
Angry 2.7%
Calm 2.5%
Fear 1.3%
Surprised 1.1%
Happy 0.4%

Microsoft Cognitive Services

Age 55
Gender Male

Microsoft Cognitive Services

Age 70
Gender Male

Microsoft Cognitive Services

Age 54
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Wheel 96%
Tie 89.6%
Suit 60.2%

Categories

Imagga

paintings art 60.4%
people portraits 38.7%