Human Generated Data

Title

Untitled (men and women at party)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17083

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.7
Human 99.7
Chair 99.6
Furniture 99.6
Clothing 98.6
Apparel 98.6
Person 98.3
Chair 96
Person 93.8
Person 92.7
Chair 87.9
Person 86.7
Chair 84
Female 78.4
Person 76.6
Suit 76.6
Coat 76.6
Overcoat 76.6
Person 76.4
Person 69.3
Indoors 69.3
Room 68.9
Person 68.1
Woman 66.9
Shorts 60.6
Leisure Activities 58.6
Outdoors 57

Clarifai
created on 2023-10-29

people 99.8
group together 98.2
group 97.8
adult 97.2
furniture 96.7
man 96.1
administration 95.5
woman 95.4
room 94
chair 93.8
two 91.2
monochrome 90.6
wear 89.8
four 87.6
several 85.3
seat 85.2
three 84.1
recreation 84.1
home 84
child 83.7

Imagga
created on 2022-02-26

chair 45.5
table 39.2
room 36.9
restaurant 28.9
interior 27.4
classroom 23.9
chairs 21.5
cafeteria 20.6
furniture 19.6
house 19.2
people 18.4
sitting 17.2
man 16.8
building 16.4
business 15.8
person 15.3
hall 15.3
floor 14.9
seat 14.8
modern 14.7
home 14.4
office 14.3
structure 14.1
inside 13.8
patio 13.5
male 13.5
indoors 13.2
window 12.9
wood 12.5
dining 12.4
lifestyle 12.3
women 11.9
design 11.8
tables 11.8
relaxation 11.7
glass 11.7
work 11.3
urban 10.5
outdoors 10.5
musical instrument 10.1
drink 10
group 9.7
style 9.6
dinner 9.6
empty 9.4
meeting 9.4
day 9.4
coffee 9.3
city 9.1
engineer 9
teacher 8.9
decor 8.8
businessman 8.8
wall 8.7
architecture 8.6
comfortable 8.6
food 8.6
communication 8.4
worker 8.3
silhouette 8.3
indoor 8.2
team 8.1
education 7.8
class 7.7
kitchen 7.6
contemporary 7.5
plant 7.5
vacation 7.4
lunch 7.3
businesswoman 7.3
sun 7.2
board 7.2
area 7.1
together 7

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

black and white 93.1
outdoor 90
chair 81.9
furniture 81.2
piano 75.9
person 74.7
clothing 71.7
musical instrument 56.8
man 53.1

Color Analysis

Face analysis

AWS Rekognition

Age 35-43
Gender Male, 95.5%
Calm 94.4%
Surprised 1.8%
Confused 1.2%
Happy 0.9%
Sad 0.9%
Angry 0.4%
Disgusted 0.3%
Fear 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Person 98.3%
Person 93.8%
Person 92.7%
Person 86.7%
Person 76.6%
Person 76.4%
Person 69.3%
Person 68.1%
Chair 99.6%
Chair 96%
Chair 87.9%
Chair 84%