Human Generated Data

Title

Untitled (man, woman, and girl looking into shop window at mysterious vacuum display)

Date

1948

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14615

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Person 99.4
Human 99.4
Person 99.2
Person 97.8
Person 96.3
People 91.1
Apparel 88.7
Clothing 88.7
Female 86.2
Person 80.1
Indoors 79.3
Machine 78.7
Girl 74.5
Petal 74
Flower 74
Plant 74
Blossom 74
Room 66.7
Face 65.8
Woman 64.1
Photography 62.7
Photo 62.7
Portrait 61.8
Child 60.5
Kid 60.5
Water 59.9
Pump 55.1
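
The machine tags above are flat "label score" lines. A minimal sketch (helper name hypothetical, not part of any vendor SDK) of turning such lines into structured label/confidence pairs, splitting on the last space so multi-word labels survive:

```python
def parse_tags(lines):
    """Parse flat 'Label score' lines into (label, confidence) pairs."""
    tags = []
    for line in lines:
        line = line.strip()
        if not line:
            continue
        # Split on the last space so labels like "exercise bike" stay whole.
        label, _, score = line.rpartition(" ")
        tags.append((label, float(score)))
    return tags
```

For example, `parse_tags(["Person 99.4", "Apparel 88.7"])` yields `[("Person", 99.4), ("Apparel", 88.7)]`.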

Imagga
created on 2022-01-29

pay-phone 52.3
telephone 48.8
electronic equipment 37.6
equipment 33
device 21.8
home 21.5
interior 21.2
technology 15.6
adult 14.9
kitchen 14.5
modern 14
house 13.4
clean 13.3
people 12.8
person 11.6
indoors 11.4
work 11
room 10.9
medical 10.6
chair 10.4
window 10.3
one 9.7
building 9.7
lifestyle 9.4
industry 9.4
floor 9.3
male 9.2
computer 8.8
light 8.7
research 8.6
communication 8.4
domestic 8.4
inside 8.3
human 8.2
cook 8.2
furniture 8.2
exercise bike 8.2
happy 8.1
appliance 8
smiling 7.9
working 7.9
urban 7.9
food 7.8
face 7.8
dial telephone 7.8
pretty 7.7
biology 7.6
ventilator 7.5
holding 7.4
occupation 7.3
design 7.3
indoor 7.3
laptop 7.3
metal 7.2
seat 7.2
hair 7.1
smile 7.1
science 7.1
information 7.1
medicine 7

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

text 99.7
clothing 90.5
person 88.4
woman 62.9

Face analysis

AWS Rekognition

Age 27-37
Gender Male, 98.9%
Surprised 49.6%
Calm 18.1%
Happy 12.1%
Confused 8.5%
Disgusted 3.7%
Angry 2.7%
Sad 2.7%
Fear 2.5%

AWS Rekognition

Age 30-40
Gender Male, 98.2%
Calm 98.2%
Surprised 1.5%
Angry 0.1%
Sad 0.1%
Disgusted 0.1%
Happy 0%
Confused 0%
Fear 0%

AWS Rekognition

Age 33-41
Gender Male, 97.9%
Calm 90.3%
Surprised 6.6%
Angry 0.9%
Fear 0.6%
Disgusted 0.5%
Sad 0.5%
Happy 0.4%
Confused 0.1%

AWS Rekognition

Age 20-28
Gender Female, 56.2%
Calm 77.4%
Happy 6.2%
Sad 6%
Surprised 2.4%
Disgusted 2.3%
Angry 2.2%
Fear 1.9%
Confused 1.4%
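
Each face record above distributes confidence across eight emotions. A minimal sketch (function name and margin threshold hypothetical) of reducing such scores to a single dominant emotion, flagging close calls like the first face's Surprised 49.6% vs. Calm 18.1%:

```python
def dominant_emotion(scores, margin=10.0):
    """Return (top_emotion, ambiguous) for a dict of emotion -> percent.

    The call is flagged ambiguous when the runner-up is within `margin`
    percentage points of the top score.
    """
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    top, runner_up = ranked[0], ranked[1]
    return top[0], (top[1] - runner_up[1]) < margin
```

Applied to the second face above (`Calm 98.2, Surprised 1.5, ...`), this would return `("Calm", False)`; a face whose top two scores sit within the margin would come back flagged as ambiguous.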

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%

Captions

Microsoft

a woman standing in front of a store 80%
a woman standing in front of a building 78.8%
a group of people standing in front of a store 69.7%

Text analysis

Amazon

REMINGTON
QUEEN
FILTER
MJI7
FILTER - QUEEN
DISPA
ST
-
MJI7 YESTAD ОСЛИА
COUP
POST
ОСЛИА
YESTAD

Google

REMINGTON
ST
LOUI
FILTER
MJI7
QUEEN
MJI7 YT3RA2 REMINGTON ST LOUI FILTER QUEEN
YT3RA2