Human Generated Data

Title

Untitled (party in laundromat, woman being pushed in cart)

Date

1957

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15266
Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 98.7
Human 98.7
Person 98
Person 97.7
Person 93.3
Clothing 89.5
Apparel 89.5
Person 89.3
Face 77.1
Clinic 71.7
People 70.9
Shelter 65.7
Outdoors 65.7
Building 65.7
Rural 65.7
Nature 65.7
Countryside 65.7
Coat 65.4
Female 59.4
Photography 58.9
Photo 58.9

Imagga
created on 2022-03-05

interior 35.4
room 32.1
table 29.4
furniture 27.8
house 25.1
modern 24.5
chair 23.5
home 23.1
window 22.8
decor 19.4
architecture 18.7
design 17.4
light 17.4
luxury 17.1
kitchen 16.5
apartment 15.3
desk 15.2
floor 14.9
3d 14.7
building 14.4
equipment 14.1
indoors 14.1
dining 13.3
empty 12.9
work 12.8
indoor 12.8
wood 12.5
wall 12
elegance 11.8
business 11.5
structure 11.3
metal 11.3
inside 11
glass 10.9
device 10.9
iron lung 10.8
computer 10.7
seat 10.6
render 10.4
contemporary 10.3
decoration 10.2
office 10.2
product 10
dishwasher 9.7
working 9.7
residential 9.6
estate 9.5
people 9.5
domestic 9
cabinet 9
technology 8.9
style 8.9
steel 8.8
man 8.7
respirator 8.6
monitor 8.3
urban 7.9
creation 7.8
white goods 7.8
shop 7.7
comfortable 7.6
tile 7.6
relaxation 7.5
newspaper 7.5
city 7.5
art 7.4
appliance 7.4
person 7.4
home appliance 7.3
furnishing 7.3
digital 7.3
new 7.3
center 7.3
hospital 7.1
day 7.1
wooden 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 99.4
window 88
indoor 87.2
black and white 85.5
person 65.2
clothing 64.6

Face analysis

AWS Rekognition

Age 26-36
Gender Female, 99.5%
Calm 89.7%
Happy 8.1%
Disgusted 1.2%
Surprised 0.3%
Angry 0.3%
Confused 0.2%
Sad 0.1%
Fear 0.1%

AWS Rekognition

Age 37-45
Gender Female, 71.1%
Surprised 47.7%
Calm 46.6%
Angry 4.3%
Confused 0.4%
Disgusted 0.4%
Fear 0.3%
Happy 0.2%
Sad 0.2%

AWS Rekognition

Age 22-30
Gender Male, 97.7%
Calm 43.6%
Confused 34.3%
Sad 6.9%
Happy 5.1%
Disgusted 3.6%
Surprised 2.8%
Fear 1.9%
Angry 1.7%

AWS Rekognition

Age 37-45
Gender Male, 84.2%
Happy 65.3%
Sad 10.9%
Disgusted 7.4%
Calm 7.2%
Confused 3.3%
Angry 2.3%
Surprised 2.1%
Fear 1.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.7%

Captions

Microsoft

a woman standing in front of a window 61.2%
a group of people standing in front of a window 50.4%
a woman standing next to a window 50.3%

Text analysis

Amazon

MADOM

Google

YT33 A2 HACON
YT33
A2
HACON