Human Generated Data

Title

Untitled (man, woman, and girl looking into shop window at mysterious vacuum display)

Date

1948

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14615

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Person 99.4
Human 99.4
Person 99.2
Person 97.8
Person 96.3
People 91.1
Clothing 88.7
Apparel 88.7
Female 86.2
Person 80.1
Indoors 79.3
Machine 78.7
Girl 74.5
Petal 74
Plant 74
Flower 74
Blossom 74
Room 66.7
Face 65.8
Woman 64.1
Photography 62.7
Photo 62.7
Portrait 61.8
Kid 60.5
Child 60.5
Water 59.9
Pump 55.1

Clarifai
created on 2023-10-29

people 99.8
adult 98.3
monochrome 96.7
man 96
woman 94.9
group together 94.7
two 91.7
outfit 89
group 88.6
wear 88.1
facial expression 85.2
administration 84.7
three 84.4
actress 84.3
vehicle 81.9
four 80.2
several 78.9
recreation 77.5
indoors 76.4
retro 75.6

Imagga
created on 2022-01-29

pay-phone 52.3
telephone 48.8
electronic equipment 37.6
equipment 33
device 21.8
home 21.5
interior 21.2
technology 15.6
adult 14.9
kitchen 14.5
modern 14
house 13.4
clean 13.3
people 12.8
person 11.6
indoors 11.4
work 11
room 10.9
medical 10.6
chair 10.4
window 10.3
one 9.7
building 9.7
lifestyle 9.4
industry 9.4
floor 9.3
male 9.2
computer 8.8
light 8.7
research 8.6
communication 8.4
domestic 8.4
inside 8.3
human 8.2
cook 8.2
furniture 8.2
exercise bike 8.2
happy 8.1
appliance 8
smiling 7.9
working 7.9
urban 7.9
food 7.8
face 7.8
dial telephone 7.8
pretty 7.7
biology 7.6
ventilator 7.5
holding 7.4
occupation 7.3
design 7.3
indoor 7.3
laptop 7.3
metal 7.2
seat 7.2
hair 7.1
smile 7.1
science 7.1
information 7.1
medicine 7

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

text 99.7
clothing 90.5
person 88.4
woman 62.9

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 27-37
Gender Male, 98.9%
Surprised 49.6%
Calm 18.1%
Happy 12.1%
Confused 8.5%
Disgusted 3.7%
Angry 2.7%
Sad 2.7%
Fear 2.5%

AWS Rekognition

Age 30-40
Gender Male, 98.2%
Calm 98.2%
Surprised 1.5%
Angry 0.1%
Sad 0.1%
Disgusted 0.1%
Happy 0%
Confused 0%
Fear 0%

AWS Rekognition

Age 33-41
Gender Male, 97.9%
Calm 90.3%
Surprised 6.6%
Angry 0.9%
Fear 0.6%
Disgusted 0.5%
Sad 0.5%
Happy 0.4%
Confused 0.1%

AWS Rekognition

Age 20-28
Gender Female, 56.2%
Calm 77.4%
Happy 6.2%
Sad 6%
Surprised 2.4%
Disgusted 2.3%
Angry 2.2%
Fear 1.9%
Confused 1.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 99.4%
Person 99.2%
Person 97.8%
Person 96.3%
Person 80.1%

Categories

Imagga

interior objects 93.9%
paintings art 4.9%

Text analysis

Amazon

REMINGTON
QUEEN
FILTER
MJI7
FILTER - QUEEN
DISPA
ST
-
MJI7 YESTAD ОСЛИА
COUP
POST
ОСЛИА
YESTAD

Google

MJI7 YT3RA2 REMINGTON ST LOUI FILTER QUEEN
MJI7
YT3RA2
REMINGTON
ST
LOUI
FILTER
QUEEN