Human Generated Data

Title

Untitled (children in front of fireplace)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17042

Machine Generated Data

Tags (confidence scores out of 100)

Amazon
created on 2022-02-26

Person 98.6
Human 98.6
Floor 98.3
Person 96.9
Person 96.7
Flooring 95.7
Clothing 82.8
Apparel 82.8
Sleeve 82
Face 78.5
Furniture 74
Lighting 72.7
Wood 70.1
Kid 68.3
Child 68.3
Indoors 66.9
Door 62.7
Long Sleeve 62.3
Hardwood 60.7
Portrait 60.3
Photography 60.3
Photo 60.3
Chair 59.6
Room 59.6
Female 55.8
Girl 55.4
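
The Amazon tags above are the kind of labels returned by AWS Rekognition's DetectLabels operation. Below is a minimal sketch of how comparable labels could be requested with boto3; the image file name is a placeholder and AWS credentials are assumed to be configured.

import boto3

# Minimal sketch: the image file name is hypothetical and AWS credentials
# are assumed to be configured in the environment.
rekognition = boto3.client("rekognition")

with open("fireplace_photo.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=55,  # roughly the lowest confidence shown in the list above
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")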

Clarifai
created on 2023-10-28

people 99.9
group 98.3
child 97.5
room 96.6
three 96
education 95.9
group together 95.7
two 95.3
four 94.5
adult 93.8
man 92.8
school 91.8
indoors 91.2
portrait 89.9
family 89.7
leader 88.9
woman 87.9
uniform 87.9
outfit 86.8
administration 86.1
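
The Clarifai concepts above come from a general image-recognition model. A hedged sketch of one way to request similar concepts over Clarifai's v2 REST API follows; the endpoint path, model name, payload shape, and credentials are assumptions, not a record of how these tags were produced.

import requests

# Hedged sketch: model name, endpoint, and payload shape are assumptions
# based on Clarifai's public v2 API; the key and image URL are placeholders.
CLARIFAI_KEY = "YOUR_API_KEY"
url = "https://api.clarifai.com/v2/models/general-image-recognition/outputs"
payload = {"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]}
headers = {"Authorization": f"Key {CLARIFAI_KEY}"}

resp = requests.post(url, json=payload, headers=headers, timeout=30)
resp.raise_for_status()

# Concepts carry a 0-1 value; scale to match the 0-100 scores listed above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")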

Imagga
created on 2022-02-26

man 32.2
people 28.4
adult 27.2
person 25.9
male 22.8
teacher 19
city 17.5
walking 17
women 16.6
ball 16.1
sport 15.7
chair 14.9
men 14.6
business 14.6
lifestyle 14.5
professional 14
educator 14
urban 14
exercise 13.6
portrait 13.6
fitness 13.5
world 13.5
wheelchair 13.1
human 12.7
casual 12.7
day 12.6
motion 12
trainer 11.6
walk 11.4
happy 11.3
shop 10.5
group 10.5
standing 10.4
sitting 10.3
two 10.2
life 10
indoor 10
worker 9.9
activity 9.9
tennis 9.7
working 9.7
businessman 9.7
equipment 9.7
indoors 9.7
seat 9.5
work 9.4
clothing 9.4
wall 9.4
light 9.4
floor 9.3
leisure 9.1
silhouette 9.1
building 9.1
health 9
interior 8.8
court 8.8
crowd 8.6
blurred 8.6
outside 8.6
outdoor 8.4
blur 8.4
action 8.3
street 8.3
room 8.3
speed 8.2
alone 8.2
outdoors 8.2
transportation 8.1
recreation 8.1
game equipment 8
rush 7.9
athlete 7.8
travel 7.7
racket 7.7
healthy 7.6
basketball 7.5
friends 7.5
furniture 7.5
holding 7.4
training 7.4
fit 7.4
lady 7.3
playing 7.3
smiling 7.2
black 7.2
looking 7.2
modern 7
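
The Imagga tags above resemble output from Imagga's tagging endpoint. A minimal sketch follows, assuming the public /v2/tags endpoint with HTTP Basic authentication; the credentials, image URL, and exact response shape are assumptions.

import requests

# Hedged sketch: endpoint and response shape are assumptions based on
# Imagga's public tagging API; credentials and URL are placeholders.
API_KEY, API_SECRET = "YOUR_KEY", "YOUR_SECRET"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=(API_KEY, API_SECRET),
    timeout=30,
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")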

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 97.6
person 92.2
floor 91.9
window 86.2
clothing 84.4
gallery 75.8
human face 71.1
toddler 70.8
boy 62.2
room 59.3
black and white 53.9
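
The Microsoft tags above are the kind of results returned by Azure Computer Vision image analysis. A hedged sketch using the v3.2 Analyze Image REST endpoint follows; the endpoint version, query parameters, and credentials are assumptions rather than a record of how these tags were generated.

import requests

# Hedged sketch: the v3.2 endpoint and "Tags" visual feature are assumptions
# based on Azure Computer Vision's Analyze Image API; endpoint and key are placeholders.
endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
key = "YOUR_SUBSCRIPTION_KEY"

resp = requests.post(
    f"{endpoint}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": key},
    json={"url": "https://example.com/photo.jpg"},
    timeout=30,
)
resp.raise_for_status()

# Azure reports confidence on a 0-1 scale; scale to match the list above.
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")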

Color Analysis

Face analysis

AWS Rekognition

Age 31-41
Gender Male, 99.7%
Sad 50.8%
Calm 30.1%
Happy 7.5%
Confused 5.4%
Disgusted 2.3%
Angry 1.6%
Surprised 1.4%
Fear 0.9%

AWS Rekognition

Age 34-42
Gender Male, 96.8%
Calm 61%
Happy 24.1%
Sad 4.7%
Angry 3.2%
Surprised 2.9%
Confused 1.5%
Disgusted 1.5%
Fear 1.2%

AWS Rekognition

Age 48-54
Gender Male, 90.6%
Sad 83.8%
Calm 7.7%
Confused 3.4%
Disgusted 2.6%
Happy 1.1%
Angry 0.7%
Surprised 0.4%
Fear 0.3%
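
The three AWS Rekognition blocks above (age range, gender, and ranked emotions per detected face) match the attributes returned by Rekognition's DetectFaces operation when all facial attributes are requested. A minimal sketch, assuming configured AWS credentials and a placeholder file name:

import boto3

# Minimal sketch: file name is hypothetical; AWS credentials are assumed.
rekognition = boto3.client("rekognition")

with open("fireplace_photo.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # include age range, gender, and emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")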

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
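
The Google Vision blocks above report per-face likelihoods (Surprise, Anger, Sorrow, Joy, Headwear, Blurred), which correspond to the likelihood fields on the face_detection response in the google-cloud-vision client. A minimal sketch, assuming google-cloud-vision 2.x and configured Google Cloud credentials; the file name is a placeholder.

from google.cloud import vision

# Minimal sketch: assumes google-cloud-vision 2.x and configured credentials;
# the image file name is hypothetical.
client = vision.ImageAnnotatorClient()

with open("fireplace_photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each field is a Likelihood enum (VERY_UNLIKELY .. VERY_LIKELY).
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)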

Feature analysis

Amazon

Person
Chair
Person 98.6%
Person 96.9%
Person 96.7%
Chair 59.6%

Categories

Text analysis

Amazon

88
KODAK-EVEELA

Google

YT37A2-XAGOX
YT37A2-XAGOX
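
The detected strings above (Kodak film-edge markings) are the kind of OCR output returned by both services' text-detection operations. A minimal sketch, assuming configured AWS and Google Cloud credentials and a placeholder file name:

import boto3
from google.cloud import vision

# Minimal sketch: file name is hypothetical; AWS and Google Cloud
# credentials are assumed to be configured in the environment.
with open("fireplace_photo.jpg", "rb") as f:
    image_bytes = f.read()

# Amazon Rekognition DetectText: returns detected lines and words.
rekognition = boto3.client("rekognition")
for detection in rekognition.detect_text(Image={"Bytes": image_bytes})["TextDetections"]:
    if detection["Type"] == "LINE":
        print("Amazon:", detection["DetectedText"])

# Google Cloud Vision text_detection: element 0 is the full text block,
# the rest are individual detected strings.
client = vision.ImageAnnotatorClient()
response = client.text_detection(image=vision.Image(content=image_bytes))
for annotation in response.text_annotations[1:]:
    print("Google:", annotation.description)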