Human Generated Data

Title

Untitled (three women in living room)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17213

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.6
Human 99.6
Person 98.7
Chair 97
Furniture 97
Clothing 92.9
Apparel 92.9
Worker 80.5
People 75.4
Female 75.1
Sitting 74.5
Face 72.2
Hairdresser 68.7
Photography 66.4
Photo 66.4
Workshop 65.1
Portrait 64.3
Meal 63.4
Food 63.4
Person 63.1
Person 62.6
Indoors 61.4
Room 61.3
Girl 60.9
Flooring 59.5
Woman 58.2
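
The label and score pairs above follow the output shape of Amazon Rekognition's DetectLabels call, where each label carries a confidence from 0 to 100. Below is a minimal sketch of how tags like these could be generated with boto3, assuming AWS credentials are configured; the file name is a hypothetical placeholder.

import boto3

client = boto3.client("rekognition")

# Hypothetical file name; any local copy of the photograph works.
with open("fogg-4.2002.17213.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=55,
)

# Each entry has a Name and a 0-100 Confidence, e.g. "Person 99.6".
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')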

Clarifai
created on 2023-10-29

people 99.9
group 99.1
furniture 99.1
adult 98.1
group together 97.7
home 97.6
room 97
man 95.8
seat 95.3
woman 94.8
two 93.4
wear 93.4
administration 93.2
child 91.9
chair 90.5
many 90.5
several 88.9
three 88.7
actress 88.2
four 86.5

Imagga
created on 2022-02-26

musical instrument 94.1
accordion 83.8
keyboard instrument 67.3
wind instrument 61.8
man 32.9
people 29.6
adult 25
male 20.6
person 20.3
business 18.8
lifestyle 17.3
women 16.6
smiling 15.2
indoors 14.9
couple 14.8
businessman 14.1
sitting 13.7
concertina 13.5
room 13.5
holding 13.2
together 13.1
men 12.9
two 12.7
office 12.3
happy 11.9
casual 11.9
free-reed instrument 11.8
chair 11.7
interior 11.5
working 11.5
boy 11.3
portrait 11
music 10.9
handsome 10.7
group 10.5
indoor 10
playing 10
modern 9.8
attractive 9.8
fun 9.7
home 9.6
play 9.5
corporate 9.4
enjoyment 9.4
musician 9.2
businesswoman 9.1
fashion 9
family 8.9
urban 8.7
love 8.7
smile 8.5
black 8.4
mature 8.4
city 8.3
leisure 8.3
inside 8.3
sexy 8
computer 8
day 7.8
happiness 7.8
education 7.8
teacher 7.7
motion 7.7
hand 7.6
businesspeople 7.6
student 7.5
dark 7.5
classroom 7.5
outdoors 7.5
instrument 7.4
20s 7.3
window 7.3
cheerful 7.3
teenager 7.3
exercise 7.3
suit 7.2
body 7.2
active 7.2
work 7.2
to 7.1
job 7.1
child 7
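
Imagga exposes its tagger as a REST endpoint rather than an SDK. The sketch below assumes the v2 /tags endpoint with basic-auth API keys, as described in Imagga's public documentation; the credentials and image URL are hypothetical placeholders.

import requests

# Hypothetical credentials and image URL.
api_key, api_secret = "YOUR_KEY", "YOUR_SECRET"
image_url = "https://example.org/fogg-4.2002.17213.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": image_url},
    auth=(api_key, api_secret),
)

# Tags arrive as {"confidence": ..., "tag": {"en": ...}} objects,
# matching entries like "musical instrument 94.1" above.
for item in response.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')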

Google
created on 2022-02-26

Black 89.5
Black-and-white 87.1
Table 84
Style 84
Plant 83.5
Font 78.2
Tints and shades 76.9
Monochrome 76.8
Curtain 76.4
Window 75.8
Monochrome photography 75.7
Room 72.7
Chair 71.5
Event 70.2
Houseplant 69.1
Rectangle 67.7
Picture frame 66.4
Stock photography 64.2
Sitting 63.7
Art 63.3
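
The Google tags above correspond to label_annotations from the Cloud Vision API, which reports scores on a 0-1 scale; the values listed here are those scores times 100. A minimal sketch, assuming the google-cloud-vision client library and a hypothetical local file name:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Hypothetical file name.
with open("fogg-4.2002.17213.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# label.score is 0-1; scaling by 100 gives values like "Black 89.5".
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")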

Microsoft
created on 2022-02-26

text 90.1
indoor 87.2
furniture 68.7
christmas tree 67.6
house 52.5
dining table 8.2

Face analysis

AWS Rekognition

Age 30-40
Gender Male, 81.7%
Calm 96.7%
Surprised 1.1%
Sad 0.8%
Confused 0.7%
Disgusted 0.4%
Happy 0.2%
Fear 0.1%
Angry 0.1%

AWS Rekognition

Age 31-41
Gender Male, 67.8%
Calm 81.5%
Surprised 8.8%
Sad 5.4%
Confused 1.3%
Disgusted 1.1%
Happy 0.7%
Angry 0.6%
Fear 0.6%

AWS Rekognition

Age 45-53
Gender Male, 74.1%
Calm 94.6%
Happy 4.8%
Surprised 0.3%
Confused 0.1%
Disgusted 0.1%
Sad 0.1%
Fear 0%
Angry 0%
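
The three blocks above match the FaceDetails records that Rekognition's DetectFaces call returns when all attributes are requested: an age range, a gender guess with its confidence, and a full emotion distribution. A minimal sketch, with a hypothetical file name:

import boto3

client = boto3.client("rekognition")

# Hypothetical file name.
with open("fogg-4.2002.17213.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

# One FaceDetails entry per detected face, mirroring the three blocks above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]      # e.g. {"Low": 30, "High": 40}
    gender = face["Gender"]     # {"Value": ..., "Confidence": ...}
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions come unsorted; sort descending to match the listings above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')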

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
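
Google Vision reports face attributes as likelihood buckets (Very unlikely through Very likely) rather than percentages, which is why the rows above carry no scores. A minimal sketch using the same client library as the Google label sketch, with a hypothetical file name:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Hypothetical file name.
with open("fogg-4.2002.17213.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each field is a Likelihood enum; .name yields e.g. "VERY_UNLIKELY".
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)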

Feature analysis

Amazon

Person 99.6%
Person 98.7%
Person 63.1%
Person 62.6%
Chair 97%

Text analysis

Amazon

KODAK-SEA
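
The single detected string above is consistent with Rekognition's DetectText call, which picks up printed text such as the film edge marking here. A minimal sketch, with a hypothetical file name:

import boto3

client = boto3.client("rekognition")

# Hypothetical file name.
with open("fogg-4.2002.17213.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_text(Image={"Bytes": image_bytes})

# LINE detections aggregate words; "KODAK-SEA" above is one such line.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])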