Human Generated Data

Title

Untitled (people sitting on chairs and couches inside house)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16452

Machine Generated Data

Tags

Amazon
created on 2022-02-11

Chair 98.5
Furniture 98.5
Person 98.4
Human 98.4
Person 97.5
Person 97
Person 95.9
Person 94.7
Person 93.7
Room 93.6
Indoors 93.6
Person 89.4
People 86.5
Person 84.1
Person 83.6
Living Room 78.2
Chair 75.6
Couch 64.2
Clinic 61.7
Helmet 61.4
Clothing 61.4
Apparel 61.4
Photography 60.7
Photo 60.7
Screen 59.9
Electronics 59.9
Crowd 59.8
Shorts 58.2
Monitor 58
Display 58
Person 57.5
Person 49.8
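
A minimal sketch of how label/confidence pairs like the Amazon list above are typically produced with AWS Rekognition's DetectLabels API via boto3; the bucket and object names below are placeholders, not part of this record.

# Sketch: labels with confidence scores, e.g. "Chair 98.5".
# Bucket/key names are placeholders for illustration only.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}},
    MaxLabels=50,
    MinConfidence=40,
)

for label in response["Labels"]:
    # Each label has a name and a confidence score in percent.
    print(f"{label['Name']} {label['Confidence']:.1f}")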

Clarifai
created on 2023-10-28

people 99.9
group together 99.1
group 98.7
man 98.1
many 97
adult 96.5
furniture 95.3
chair 95.3
woman 95.2
seat 94.2
sit 94.1
sitting 91.9
education 87.1
several 87.1
leader 86.9
child 86.5
outfit 81.8
room 80.6
recreation 79.8
actor 78.5

Imagga
created on 2022-02-11

classroom 72.8
room 69.1
people 33.5
male 31.2
man 30.9
person 30.7
adult 27.4
office 26.8
businessman 26.5
business 26.1
table 26
group 25
sitting 24.9
indoors 24.6
desk 24.6
women 23.7
teacher 22.2
meeting 21.7
education 20.8
men 20.6
wind instrument 19
work 18.8
smiling 18.8
computer 18.5
home 18.3
brass 18.3
happy 17.5
professional 17.4
laptop 16.7
indoor 16.4
talking 16.2
communication 16
lifestyle 15.9
team 15.2
worker 15.2
businesspeople 15.2
chair 15.2
executive 15.1
manager 14.9
teamwork 14.8
businesswoman 14.5
musical instrument 14.3
smile 14.2
together 14
corporate 13.7
conference 13.7
student 13.6
senior 13.1
couple 13.1
phone 12.9
class 12.5
school 12.5
interior 12.4
working 12.4
cheerful 12.2
mature 12.1
modern 11.9
board 11.8
cornet 11.7
job 11.5
bassoon 11.5
learning 11.3
study 11.2
blackboard 11.2
portrait 11
occupation 11
holding 10.7
colleagues 10.7
studying 10.6
drinking 10.5
looking 10.4
presentation 10.2
casual 10.2
hall 10
hand 9.9
casual clothing 9.8
teaching 9.7
day 9.4
happiness 9.4
friends 9.4
coffee 9.3
successful 9.1
confident 9.1
suit 9
handsome 8.9
technology 8.9
lecture 8.8
educator 8.7
couch 8.7
glass 8.6
adults 8.5
trombone 8.5
friendship 8.4
restaurant 8.4
color 8.3
camera 8.3
success 8
living room 7.8
sofa 7.7
workplace 7.6
training 7.4
wine 7.4
alone 7.3
life 7.3
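
The Imagga tags above can be reproduced with a request to Imagga's tagging service; the sketch below assumes the v2 /tags REST endpoint and its documented response layout, and the credentials and image URL are placeholders.

# Sketch: tag/confidence pairs, e.g. "classroom 72.8".
# API key, secret, and image URL are placeholders.
import requests

API_KEY = "your_api_key"        # placeholder
API_SECRET = "your_api_secret"  # placeholder

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},
    auth=(API_KEY, API_SECRET),
    timeout=30,
)
response.raise_for_status()

for entry in response.json()["result"]["tags"]:
    # Each entry carries a confidence score and a localized tag name.
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")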

Google
created on 2022-02-11

Microsoft
created on 2022-02-11

text 92.6
person 87.6
clothing 77.4
furniture 76.8
chair 54
old 40.8
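
A hedged sketch of how tags like the Microsoft list above can be requested, assuming Azure Computer Vision's v3.2 Analyze Image REST endpoint; the endpoint, key, and image URL are placeholders, and Azure reports confidences in the 0-1 range (shown above as percentages).

# Sketch: image tags from Azure Computer Vision, e.g. "text 92.6".
# Endpoint, subscription key, and image URL are placeholders.
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "your_subscription_key"                                     # placeholder

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/photo.jpg"},
    timeout=30,
)
response.raise_for_status()

for tag in response.json()["tags"]:
    # Confidence is 0-1; multiply by 100 to match the list above.
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")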

Color Analysis

Face analysis

AWS Rekognition

Age 33-41
Gender Male, 99.9%
Calm 90.9%
Sad 3.6%
Angry 2.1%
Fear 0.9%
Surprised 0.7%
Happy 0.7%
Confused 0.6%
Disgusted 0.5%

AWS Rekognition

Age 48-56
Gender Male, 99.6%
Calm 98.2%
Sad 1.5%
Angry 0.1%
Happy 0%
Surprised 0%
Confused 0%
Fear 0%
Disgusted 0%

AWS Rekognition

Age 41-49
Gender Male, 69.9%
Calm 38.5%
Fear 35.6%
Sad 17.8%
Confused 3.9%
Happy 1.8%
Angry 1.4%
Disgusted 0.6%
Surprised 0.4%

AWS Rekognition

Age 45-53
Gender Male, 73.6%
Calm 92.9%
Happy 2.4%
Sad 2%
Confused 1.5%
Angry 0.6%
Surprised 0.4%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 27-37
Gender Male, 76.3%
Calm 77.2%
Sad 18.7%
Confused 1.2%
Happy 1.1%
Angry 0.5%
Disgusted 0.5%
Fear 0.4%
Surprised 0.4%
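
A minimal sketch of how age, gender, and emotion estimates like the AWS Rekognition blocks above are typically obtained with the DetectFaces API and all facial attributes requested; the bucket and object names are placeholders.

# Sketch: per-face age range, gender, and emotion scores,
# e.g. "Age 33-41", "Gender Male, 99.9%", "Calm 90.9%".
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back per face with confidence in percent.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")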

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
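
A minimal sketch of how the Google Vision likelihood ratings above (Surprise, Anger, Sorrow, Joy, Headwear, Blurred) are typically obtained with the Cloud Vision client library's face detection; it assumes the 2.x google-cloud-vision package, and the image URI is a placeholder.

# Sketch: per-face likelihood ratings, e.g. "Surprise Very unlikely".
from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image()
image.source.image_uri = "https://example.org/photo.jpg"  # placeholder

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each field is a Likelihood enum such as VERY_UNLIKELY or VERY_LIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)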

Feature analysis

Amazon

Chair

Chair 98.5%
Chair 75.6%

Person

Person 98.4%
Person 97.5%
Person 97%
Person 95.9%
Person 94.7%
Person 93.7%
Person 89.4%
Person 84.1%
Person 83.6%
Person 57.5%
Person 49.8%

Helmet

Helmet 61.4%

Categories

Imagga

interior objects 98.4%
paintings art 1.1%

Text analysis

Amazon

15

Google

15
15
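
A minimal sketch of how text readings like the "15" entries above are typically obtained with AWS Rekognition's DetectText API; the bucket and object names are placeholders.

# Sketch: detected text strings with confidence scores.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}}
)

for detection in response["TextDetections"]:
    # LINE detections give whole strings; WORD detections give tokens.
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f"{detection['Confidence']:.1f}%")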