Human Generated Data

Title

Untitled (woman holding child on lap)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17363

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Clothing 99.6
Apparel 99.6
Person 99.5
Human 99.5
Face 93.8
Female 92.4
Furniture 88.3
Chair 85.4
Shoe 84
Footwear 84
Hair 77.9
Portrait 76.7
Photography 76.7
Photo 76.7
Woman 76.6
Dress 72.8
Girl 69.2
Fashion 65
People 64.6
Gown 63.3
Smile 62.7
Hat 62.1
Couch 61.6
Robe 59.9
Sitting 58.6
Wedding 58.2
Kid 57.4
Child 57.4
Wedding Gown 57.2
Shoe 57.1
Suit 57.1
Coat 57.1
Overcoat 57.1
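
The label/confidence pairs above are raw machine output; the record does not document which SDK or call produced them. As an illustrative sketch only, comparable labels could be requested from AWS Rekognition via boto3 roughly as follows (the file name, region, and thresholds are hypothetical, not part of this record):

    import boto3

    # Hypothetical example: request label/confidence pairs similar to the list above.
    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("untitled_woman_holding_child.jpg", "rb") as f:  # placeholder file name
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=50,       # cap on returned labels
        MinConfidence=55,   # drop low-confidence guesses
    )

    for label in response["Labels"]:
        # Prints e.g. "Clothing 99.6", matching the format used in this record.
        print(f"{label['Name']} {label['Confidence']:.1f}")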

Clarifai
created on 2023-10-28

people 99.8
music 99.5
adult 98.9
monochrome 98.9
musician 98.8
man 98.6
two 98.6
singer 97.7
woman 97.2
group 96
portrait 94.7
microphone 93.8
one 92.5
actress 91.1
chair 90
sit 89.7
three 89.3
art 89.1
sitting 88.6
child 88.3

Imagga
created on 2022-02-26

mask 45.7
person 28.8
male 26.9
man 26.9
protective covering 24.3
device 21.3
people 21.2
adult 19.5
covering 19.4
black 17.4
human 16.5
body 15.2
face 14.9
hair 12.7
modern 12.6
technology 12.6
style 12.6
portrait 12.3
photographer 11.6
studio 11.4
hand 11.4
network 11.1
microphone 11
suit 10.8
costume 10.6
military 10.6
war 10.6
fashion 10.6
future 10.2
protection 10
music 10
holding 9.9
projector 9.8
groom 9.8
women 9.5
men 9.4
lifestyle 9.4
expression 9.4
connection 9.1
art 9.1
danger 9.1
make 9.1
sexy 8.8
conceptual 8.8
soldier 8.8
light 8.7
business 8.5
skin 8.5
communication 8.4
attractive 8.4
oxygen mask 8.3
dark 8.3
one 8.2
information 8
businessman 7.9
love 7.9
toxic 7.8
horror 7.8
nuclear 7.8
sitting 7.7
chemical 7.7
gas 7.7
pollution 7.7
gesture 7.6
power 7.6
environment 7.4
computer 7.3
sensuality 7.3
futuristic 7.2
screen 7.1
equipment 7

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

person 94.8
text 88.8
cartoon 75.3
black and white 62.3
clothing 56.3

Face analysis

AWS Rekognition

Age 28-38
Gender Female, 100%
Happy 39.9%
Calm 28.9%
Surprised 21.2%
Fear 3.8%
Sad 2.4%
Confused 1.4%
Disgusted 1.2%
Angry 1.2%
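
The age range, gender, and emotion scores above correspond to fields returned by AWS Rekognition face detection. As a hedged sketch only (the record does not document the exact call), values like these could be retrieved as shown below; the file name is a placeholder:

    import boto3

    # Illustrative only: fetch age range, gender, and emotion confidences
    # comparable to the AWS Rekognition values listed above.
    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("untitled_woman_holding_child.jpg", "rb") as f:  # placeholder file name
        image_bytes = f.read()

    faces = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

    for face in faces["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.0f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")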

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Possible
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Shoe 84%
Shoe 57.1%

Captions

Microsoft
created on 2022-02-26

a person posing for the camera 82.2%
a man sitting in a chair 61.6%
a man holding a gun 40.5%

Text analysis

Amazon

اغ