Human Generated Data

Title

Untitled (man and two women in living room)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17152

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.3
Human 99.3
Person 99.1
Person 98.1
Furniture 98
Indoors 91.8
Room 91.4
People 82.4
Interior Design 79.4
Living Room 76.7
Face 72.5
Baby 64.4
Chair 63
Bedroom 55.8
Bed 55.7
Clinic 55.3
Person 47.6

Clarifai
created on 2023-10-29

people 99.8
group 99.3
child 97.8
adult 96.5
man 95.6
group together 95.4
room 95.1
furniture 94.8
woman 94.6
family 94.4
several 92.8
sit 91.9
leader 89.7
three 89.7
two 88.8
home 87.5
four 87.2
offspring 86.7
son 86.2
chair 85.7

Imagga
created on 2022-02-26

interior 23
room 22.7
window 22.5
people 21.7
sketch 21.5
home 21.5
drawing 20.4
house 20
table 16.3
person 15.3
man 14.8
architecture 14
building 13.9
furniture 13.7
women 13.4
adult 13
indoor 12.8
city 12.5
indoors 12.3
men 12
modern 11.9
winter 11.9
representation 11.8
snow 11.6
male 11.4
group 11.3
sitting 11.2
shop 10.8
urban 10.5
chair 10.2
team 9.8
family 9.8
cheerful 9.7
business 9.7
portrait 9.7
design 9.6
professional 9.5
structure 9.4
smile 9.3
negative 9.2
working 8.8
happy 8.8
smiling 8.7
work 8.6
hall 8.6
elegant 8.6
living 8.5
barbershop 8.4
office 8.3
inside 8.3
human 8.2
style 8.2
new 8.1
life 8
decor 8
lifestyle 7.9
scene 7.8
luxury 7.7
wall 7.7
decoration 7.6
fashion 7.5
holding 7.4
light 7.3
worker 7.2
activity 7.2
kin 7.2
mercantile establishment 7.1
balcony 7.1
day 7.1

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 96.9
house 93.4
table 85.4
furniture 79.8
vase 74.6
clothing 72
drawing 64.7
person 64
room 47.6

Color Analysis

Face analysis

AWS Rekognition

Age 41-49
Gender Male, 89%
Happy 56.9%
Sad 35.3%
Calm 2.4%
Surprised 2%
Angry 1.4%
Disgusted 0.8%
Fear 0.6%
Confused 0.6%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 99.3%
Person 99.1%
Person 98.1%
Person 47.6%

Categories

Imagga

interior objects 92.1%
paintings art 7.7%

Text analysis

Amazon

20
AEI
KODAK-EVEELA

Google

20
20