Human Generated Data

Title

Untitled (people standing outside of car)

Date

1947

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15533

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Human 99.5
Person 99.5
Person 98.1
Clothing 94.2
Apparel 94.2
Person 92.1
Machine 90.2
Wheel 90.2
Person 89.2
Vehicle 87.5
Car 87.5
Automobile 87.5
Transportation 87.5
Shorts 75.1
Door 74.4
Person 69.7
Tire 60.2
Hat 55.3
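The Amazon tags above are confidence scores (0–100); consumers of such data typically apply a minimum-confidence threshold before displaying labels. A minimal sketch in plain Python, using a few of the pairs listed above — the threshold value is an arbitrary illustrative choice, not part of the record:

```python
# A few (label, confidence) pairs from the Amazon tag list above.
labels = [
    ("Person", 99.5), ("Clothing", 94.2), ("Wheel", 90.2),
    ("Car", 87.5), ("Shorts", 75.1), ("Tire", 60.2), ("Hat", 55.3),
]

def filter_labels(pairs, min_confidence=80.0):
    """Keep only labels at or above the threshold, most confident first."""
    kept = [(name, conf) for name, conf in pairs if conf >= min_confidence]
    return sorted(kept, key=lambda pair: pair[1], reverse=True)

print(filter_labels(labels))
# → [('Person', 99.5), ('Clothing', 94.2), ('Wheel', 90.2), ('Car', 87.5)]
```

At an 80-point threshold, low-confidence guesses such as "Hat 55.3" are dropped while the car- and person-related labels survive.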

Imagga
created on 2022-02-05

home appliance 33.9
appliance 27.9
television 27.5
white goods 24.6
person 23
people 22.3
home 21.5
kitchen appliance 17.2
telecommunication system 17.2
room 16.9
car 16.1
computer 16
adult 15.5
office 15.3
man 14.8
dishwasher 14.7
interior 14.1
working 14.1
business 14
microwave 13.9
monitor 13.5
work 13.3
male 12.8
happy 12.5
equipment 12
technology 11.9
durables 11.7
house 11.7
indoors 11.4
black 11.4
refrigerator 11
washer 10.8
smile 10.7
worker 10.7
laptop 10.6
job 10.6
portrait 10.4
screen 10.2
communication 10.1
device 9.8
attractive 9.8
to 9.7
new 9.7
sitting 9.4
child 9.4
vehicle 9.3
inside 9.2
modern 9.1
machine 9
transportation 9
businessman 8.8
looking 8.8
automobile 8.6
keyboard 8.5
holding 8.2
transport 8.2
smiling 8
lifestyle 7.9
men 7.7
auto 7.7
electronic equipment 7.6
hand 7.6
toaster 7.6
clothes 7.5
future 7.4
security 7.3
back 7.3
lady 7.3
success 7.2
kitchen 7.2
family 7.1
face 7.1
newspaper 7.1
hospital 7

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

text 97
vehicle 91.9
outdoor 87.8
clothing 85.6
car 84.3
land vehicle 82.8
person 81.1
black and white 77.9
posing 62.8
man 55.5

Face analysis

AWS Rekognition

Age 18-24
Gender Female, 93.9%
Happy 46.2%
Sad 23.4%
Calm 20.2%
Disgusted 4.6%
Fear 2.3%
Angry 1.4%
Confused 0.9%
Surprised 0.9%

AWS Rekognition

Age 23-31
Gender Female, 56.8%
Calm 85.2%
Happy 7.1%
Surprised 2.3%
Sad 1.9%
Confused 1.8%
Disgusted 0.8%
Fear 0.5%
Angry 0.4%
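Rekognition reports a full emotion distribution per face; the single emotion usually quoted for a face is simply the highest-scoring entry. A minimal sketch in plain Python, using the second face's scores from above:

```python
# Emotion scores (percent) for the second detected face, as listed above.
emotions = {
    "Calm": 85.2, "Happy": 7.1, "Surprised": 2.3, "Sad": 1.9,
    "Confused": 1.8, "Disgusted": 0.8, "Fear": 0.5, "Angry": 0.4,
}

def dominant_emotion(scores):
    """Return the (emotion, score) pair with the highest confidence."""
    return max(scores.items(), key=lambda item: item[1])

print(dominant_emotion(emotions))
# → ('Calm', 85.2)
```

Note how the first face's distribution is far flatter (Happy 46.2% vs. Sad 23.4% and Calm 20.2%), so its dominant label is much less certain than this face's "Calm".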

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Wheel 90.2%
Car 87.5%

Captions

Microsoft

a group of people posing for a photo 88.2%
a group of people posing for the camera 88.1%
a group of people posing for a picture 88%