Human Generated Data

Title

Untitled (elderly woman in sailor hat holding baby)

Date

1947

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15540.2

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Human 97.8
Person 97.8
Furniture 93.2
Transportation 80.7
Train 80.7
Vehicle 80.7
Bus 66.3
Cafe 65.1
Restaurant 65.1
Person 62.9
Meal 59
Food 59
Chair 58.7
Sitting 56.1
Cafeteria 55.9
Bed 55.5
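
The label/confidence pairs above are the raw output of an image-tagging service. Such lists are commonly post-processed by keeping only labels above a confidence threshold; a minimal sketch, using the values copied from the Amazon list above (the 80.0 threshold is an illustrative assumption, not part of this record):

```python
# Label/confidence pairs copied verbatim from the Amazon tag list above.
labels = [
    ("Human", 97.8), ("Person", 97.8), ("Furniture", 93.2),
    ("Transportation", 80.7), ("Train", 80.7), ("Vehicle", 80.7),
    ("Bus", 66.3), ("Cafe", 65.1), ("Restaurant", 65.1),
    ("Person", 62.9), ("Meal", 59.0), ("Food", 59.0),
    ("Chair", 58.7), ("Sitting", 56.1), ("Cafeteria", 55.9),
    ("Bed", 55.5),
]

def confident_labels(pairs, threshold=80.0):
    """Keep only labels at or above the confidence threshold,
    highest confidence first (ties keep their original order)."""
    kept = [(name, conf) for name, conf in pairs if conf >= threshold]
    return sorted(kept, key=lambda p: p[1], reverse=True)

print(confident_labels(labels))
```

With the assumed 80.0 cutoff, only the six high-confidence labels (Human, Person, Furniture, Transportation, Train, Vehicle) survive, which matches the subset repeated under "Feature analysis" below plus the furniture/transport tags.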

Imagga
created on 2022-02-05

passenger 93.7
car 40.9
vehicle 39.4
transportation 33.2
conveyance 23.8
transport 21
man 20.8
train 20.6
travel 19
people 19
wheeled vehicle 18.7
automobile 17.2
passenger car 16.4
adult 16.2
ambulance 15.8
auto 15.3
inside 14.7
tramway 14
bus 13.8
happy 13.8
male 13.5
public transport 13.1
smiling 13
truck 12.2
business 12.1
person 12.1
equipment 11.4
industry 11.1
emergency 10.6
indoors 10.5
modern 10.5
sitting 10.3
subway train 10.2
road 9.9
motor vehicle 9.9
interior 9.7
driver 9.7
metal 9.7
hospital 9.5
drive 9.5
work 9.4
lifestyle 9.4
life 9.4
help 9.3
health 9
shop 9
working 8.8
medical 8.8
station 8.7
happiness 8.6
fast 8.4
color 8.3
service 8.3
care 8.2
room 8.2
barbershop 8.1
smile 7.8
standing 7.8
portrait 7.8
motor 7.7
insurance 7.7
driving 7.7
men 7.7
old 7.7
two 7.6
traffic 7.6
machine 7.6
wheel 7.5
senior 7.5
city 7.5
outdoors 7.5
door 7.5
technology 7.4
safety 7.4
new 7.3
patient 7.2
steel 7.1

Microsoft
created on 2022-02-05

text 97.7
vehicle 93
land vehicle 92.3
black and white 91.6
outdoor 85.3
white goods 72.7
wheel 68.4
street 51.2
bus 50.8

Face analysis

AWS Rekognition

Age 48-54
Gender Male, 99.4%
Calm 96.2%
Surprised 1.3%
Happy 1.2%
Disgusted 0.4%
Sad 0.3%
Confused 0.3%
Angry 0.2%
Fear 0.1%

AWS Rekognition

Age 25-35
Gender Male, 94.5%
Surprised 63%
Happy 20.5%
Fear 7.6%
Disgusted 2.9%
Angry 2%
Calm 1.8%
Confused 1.3%
Sad 0.9%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
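
The per-face emotion percentages above sum to roughly 100% and are usually reduced to a single dominant emotion per face. A minimal sketch of that reduction, using the scores copied from the first AWS Rekognition face block above (the function name is illustrative, not part of any API):

```python
# Emotion scores for the first detected face, copied from the
# AWS Rekognition block above (values are percentages).
emotions = {
    "Calm": 96.2, "Surprised": 1.3, "Happy": 1.2, "Disgusted": 0.4,
    "Sad": 0.3, "Confused": 0.3, "Angry": 0.2, "Fear": 0.1,
}

def dominant_emotion(scores):
    """Return the (label, confidence) pair with the highest score."""
    return max(scores.items(), key=lambda item: item[1])

print(dominant_emotion(emotions))  # → ('Calm', 96.2)
```

For the second face the same reduction would yield Surprised at 63%, a much weaker signal, which is why per-emotion scores are often reported in full rather than collapsed.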

Feature analysis

Amazon

Person 97.8%
Train 80.7%
Bus 66.3%

Captions

Microsoft

a group of people standing in front of a store window 55.6%
a group of people in front of a store window 55.5%
a person standing in front of a store window 55.4%

Text analysis

Amazon

Elder's
DA

Google

Flders
Flders