Human Generated Data

Title

Untitled (boy and man playing with model train)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17734

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Human 99.5
Person 99.5
Person 99
Game 84.6
Gambling 76.7
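
These label scores are the kind of output Amazon Rekognition's label-detection API returns for an image. A minimal sketch of how such tags could be generated with boto3, assuming AWS credentials are configured; the file name "photo.jpg" is a placeholder:

    # Minimal sketch: image labels from Amazon Rekognition via boto3.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=10,        # cap the number of labels returned
            MinConfidence=75,    # drop low-confidence labels
        )

    # Prints label names with confidence scores, e.g. "Person 99.5".
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")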

Imagga
created on 2022-02-26

man 30.2
people 25.6
adult 24
person 23.9
male 22.7
car 18.5
vehicle 16.4
lifestyle 15.9
outdoors 15.7
men 14.6
happy 13.8
automobile 13.4
machinist 13.2
device 12.9
happiness 12.5
smiling 12.3
smile 12.1
home 12
transportation 11.6
attractive 11.2
worker 10.8
portrait 10.3
work 10.3
women 10.3
day 10.2
casual 10.2
cheerful 9.7
auto 9.6
equipment 9.5
color 9.4
city 9.1
transport 9.1
health 9
human 9
technology 8.9
working 8.8
indoors 8.8
standing 8.7
water 8.7
outside 8.5
bobsled 8.5
business 8.5
horizontal 8.4
house 8.4
hand 8.4
clean 8.3
occupation 8.2
machine 8
conveyance 7.9
tool 7.9
urban 7.9
couple 7.8
boy 7.8
luxury 7.7
motion 7.7
engine 7.7
professional 7.7
active 7.6
adults 7.6
side 7.5
fun 7.5
patient 7.4
case 7.3
looking 7.2
summer 7.1
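
Imagga serves its tagging model over a REST endpoint. A rough sketch of a request that would produce a tag list like the one above, assuming the requests library; the API key, secret, and image URL are placeholders:

    # Minimal sketch: content tags from the Imagga /v2/tags endpoint.
    import requests

    IMAGGA_KEY = "your_api_key"        # placeholder
    IMAGGA_SECRET = "your_api_secret"  # placeholder
    image_url = "https://example.org/photo.jpg"

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": image_url},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),  # HTTP basic auth with key/secret
    )
    response.raise_for_status()

    # Each entry pairs a tag with a confidence score, e.g. "man 30.2".
    for tag in response.json()["result"]["tags"]:
        print(f"{tag['tag']['en']} {tag['confidence']:.1f}")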

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

person 90.5
text 85.6
clothing 61.4
black and white 54.9
man 52.4
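
These tags match the output of the Azure Computer Vision tagging operation. A minimal sketch using the azure-cognitiveservices-vision-computervision SDK, assuming a provisioned resource; the endpoint, subscription key, and image URL are placeholders:

    # Minimal sketch: image tags from Azure Computer Vision.
    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder
        CognitiveServicesCredentials("your_subscription_key"),   # placeholder
    )

    result = client.tag_image("https://example.org/photo.jpg")

    # Confidences come back in [0, 1]; scale to match the listing above.
    for tag in result.tags:
        print(f"{tag.name} {tag.confidence * 100:.1f}")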

Face analysis

AWS Rekognition

Age 12-20
Gender Male, 79.9%
Surprised 71.1%
Calm 22.2%
Angry 2.9%
Happy 1.3%
Fear 0.9%
Disgusted 0.7%
Sad 0.6%
Confused 0.2%

AWS Rekognition

Age 21-29
Gender Male, 90.3%
Sad 67%
Calm 32%
Confused 0.3%
Angry 0.2%
Happy 0.2%
Fear 0.1%
Disgusted 0.1%
Surprised 0.1%
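
The two age/gender/emotion blocks above correspond to the two faces Rekognition detected. A minimal sketch of the face-analysis call with boto3, assuming AWS credentials are configured; "photo.jpg" is a placeholder:

    # Minimal sketch: face analysis (age range, gender, emotions) with
    # Amazon Rekognition via boto3.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age, gender, emotions, etc.
        )

    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions are scored individually, e.g. "Surprised 71.1%".
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")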

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
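
Google Vision reports each face attribute as a likelihood bucket rather than a numeric score, one block per detected face. A minimal sketch with the google-cloud-vision client library, assuming application credentials are configured; "photo.jpg" is a placeholder:

    # Minimal sketch: face detection with the Google Cloud Vision client.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Each attribute is a likelihood bucket, e.g. "Joy VERY_UNLIKELY".
    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)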

Feature analysis

Amazon

Person 99.5%

Captions

Microsoft

a man sitting on a cutting board 38%
a man sitting on a table 37.9%
a man sitting at a table 37.8%
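
Caption candidates like these come from the Azure Computer Vision describe operation, which returns several ranked sentences with confidences. A minimal sketch, assuming a provisioned resource; the endpoint, subscription key, and image URL are placeholders:

    # Minimal sketch: ranked caption candidates from Azure Computer Vision.
    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder
        CognitiveServicesCredentials("your_subscription_key"),   # placeholder
    )

    description = client.describe_image(
        "https://example.org/photo.jpg",
        max_candidates=3,  # return several ranked captions, as listed above
    )

    # Confidences come back in [0, 1], e.g. 0.378 -> "37.8%".
    for caption in description.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}%")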