Human Generated Data

Title

Untitled (girl sitting on couch)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17401

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Furniture 99.9
Couch 99.5
Person 97.5
Human 97.5
Cushion 88
Home Decor 87.4
Chair 86
Clothing 81.8
Apparel 81.8
Bed 78
Face 69.3
Pillow 67.1
Female 65.3
Finger 59.9
Table 57.8

Clarifai
created on 2023-10-29

people 100
one 99.3
adult 98.8
man 97.5
portrait 96.8
leader 96.7
wear 96.3
music 95
outfit 93.9
facial hair 91.9
musician 91
administration 90.3
exploration 88.4
print 86.5
chair 86.3
military 85.5
two 84.7
art 83.2
scientist 83.2
war 81.1

Imagga
created on 2022-02-26

man 31.6
person 30.3
male 29.8
people 20.6
gun 20.6
uniform 20.1
negative 19.7
world 19.5
mask 17
adult 16.9
soldier 16.6
military 16.4
clothing 15.7
film 15.7
sport 14.8
protection 14.5
danger 14.5
device 13.2
rifle 12.9
human 12.7
weapon 12.7
military uniform 12.5
fun 12
safety 12
industry 11.9
playing 11.8
industrial 11.8
megaphone 11.7
portrait 11.6
war 11.6
skill 11.5
photographic paper 11.4
player 11.3
fashion 11.3
camouflage 10.9
equipment 10.9
disaster 10.7
body 10.4
black 10.2
work 10.2
competition 10.1
brass 10
worker 9.8
toxic 9.8
army 9.7
gas 9.6
acoustic device 9.5
future 9.3
training 9.2
game 8.9
symbol 8.7
athlete 8.7
professional 8.6
wind instrument 8.6
smoke 8.4
leisure 8.3
covering 8.2
environment 8.2
suit 8.1
sexy 8
radioactive 7.8
radiation 7.8
protective 7.8
men 7.7
modern 7.7
photographic equipment 7.7
protect 7.7
power 7.5
technology 7.4
style 7.4
action 7.4
event 7.4
business 7.3
art 7.3
dirty 7.2
sax 7.2
engineer 7

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 99.5
book 94.2
clothing 66.1
person 63.2
black and white 61.8

Color Analysis

Face analysis

AWS Rekognition

Age 23-33
Gender Female, 56.4%
Calm 66.1%
Happy 15.1%
Sad 7.3%
Angry 4.4%
Confused 3.1%
Surprised 2%
Fear 1.2%
Disgusted 0.8%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 97.5%

Captions

Microsoft
created on 2022-02-26

a man holding a book 38.1%
an old photo of a man 38%
a man sitting on a book 26.9%

Text analysis

Amazon

MJIR
YT3RAS
MJIR YT3RAS ALSMA
me
ALSMA

Google

MJIR YT3RA20
MJIR
YT3RA20