Human Generated Data

Title

Untitled (reception room for advertising agency)

Date

1948

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20052

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.6
Human 99.6
Furniture 99.6
Person 99.4
Shoe 99
Clothing 99
Footwear 99
Apparel 99
Chair 95.6
Person 92.3
Room 91.8
Indoors 91.8
Dining Table 91.7
Table 91.7
Person 78.2
Meal 73.1
Food 73.1
Dining Room 67
Home Decor 66.9
Person 66.3
Lamp 62.6
Living Room 61.5
People 61.3
Restaurant 60
Dish 57.7
Tabletop 55
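
These Amazon tags are label/confidence pairs (0-100 scale) of the kind the AWS Rekognition DetectLabels API returns. A minimal Python sketch, assuming boto3 credentials are configured and using a placeholder file name rather than the actual museum image:

import boto3

client = boto3.client("rekognition")

# Placeholder file; the museum image itself is not referenced here.
with open("photograph.jpg", "rb") as f:
    response = client.detect_labels(Image={"Bytes": f.read()}, MinConfidence=55)

for label in response["Labels"]:
    # e.g. "Person 99.6" -- Rekognition reports confidence on a 0-100 scale
    print(f'{label["Name"]} {label["Confidence"]:.1f}')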

Clarifai
created on 2023-10-22

people 99.6
group 98.5
chair 98.5
furniture 98
room 96.7
leader 96.4
group together 96.2
table 95.7
indoors 95.4
man 94
administration 91.4
woman 91.2
several 90.7
adult 90.5
seat 89.7
sit 88
wedding 87.6
many 87.5
dining room 82.8
monochrome 80.6
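
The Clarifai tags are concept/score pairs. A hedged sketch using the clarifai-grpc Python client; the model ID, access token, and image URL are placeholders, and some account setups also require a user_app_id on the request:

from clarifai_grpc.channel.clarifai_channel import ClarifaiChannel
from clarifai_grpc.grpc.api import resources_pb2, service_pb2, service_pb2_grpc

stub = service_pb2_grpc.V2Stub(ClarifaiChannel.get_grpc_channel())
metadata = (("authorization", "Key YOUR_PAT"),)  # placeholder personal access token

request = service_pb2.PostModelOutputsRequest(
    model_id="general-image-recognition",  # assumed general concept model
    inputs=[resources_pb2.Input(data=resources_pb2.Data(
        image=resources_pb2.Image(url="https://example.org/photo.jpg")))],
)
response = stub.PostModelOutputs(request, metadata=metadata)

for concept in response.outputs[0].data.concepts:
    # e.g. "people 99.6" -- concept values are 0-1, printed here as percentages
    print(f"{concept.name} {concept.value * 100:.1f}")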

Imagga
created on 2022-03-05

teacher 54.2
room 45.2
person 38.7
professional 38.5
classroom 33.8
man 33.6
people 33.5
educator 33.4
male 33.3
businessman 30
adult 29.1
business 25.5
office 25.4
women 23.7
meeting 23.5
men 22.3
table 21.7
team 21.5
group 21
worker 20.1
businesswoman 20
indoors 19.3
nurse 19.2
work 18.8
smiling 18.8
teamwork 18.5
together 18.4
happy 18.2
corporate 18
sitting 18
executive 17.8
modern 17.5
interior 16.8
lifestyle 16.6
chair 15.7
couple 15.7
job 15
suit 14.4
conference 13.7
patient 13.6
waiter 13.4
talking 13.3
businesspeople 13.3
holding 13.2
success 12.9
restaurant 12.8
home 12.8
portrait 12.3
desk 12.3
cheerful 12.2
life 12.1
manager 12.1
mature 12.1
successful 11.9
indoor 11.9
two 11.9
laptop 11.8
happiness 11.7
working 11.5
hospital 11.4
new 11.3
education 11.3
presentation 11.2
communication 10.9
40s 10.7
smile 10.7
student 10.5
employee 10.5
hall 10.3
cafeteria 10.1
adults 9.5
togetherness 9.4
dining-room attendant 9.2
blackboard 9
family 8.9
discussion 8.8
colleagues 8.7
standing 8.7
board 8.1
handsome 8
computer 8
20-24 years 7.9
seminar 7.9
class 7.7
diversity 7.7
boss 7.7
workplace 7.6
finance 7.6
plan 7.6
school 7.5
enjoyment 7.5
ideas 7.5
looking 7.2
clothing 7.1
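
The Imagga tags follow the shape of Imagga's v2 tagging endpoint, which returns a tag name and a 0-100 confidence. A sketch using the requests library; the API key, secret, and image URL are placeholders:

import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),  # placeholder credentials
)
for tag in resp.json()["result"]["tags"]:
    # e.g. "teacher 54.2"
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')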

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

table 96.1
furniture 93.8
text 90.1
clothing 86.3
person 84.6
man 79.3
chair 73.2
black and white 72.8
wedding 60
room 45.3
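
The Microsoft tags resemble the output of the Azure Computer Vision image-analysis REST API with the Tags feature. A sketch with placeholder endpoint, key, and image URL (the API version is an assumption):

import requests

endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
resp = requests.post(
    f"{endpoint}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
    json={"url": "https://example.org/photo.jpg"},
)
for tag in resp.json()["tags"]:
    # e.g. "table 96.1" -- confidence is 0-1 in the response, printed as a percentage
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')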

Color Analysis

Face analysis

AWS Rekognition

Age 35-43
Gender Male, 99.8%
Calm 33.7%
Happy 27.5%
Surprised 26.5%
Sad 7.4%
Fear 2.9%
Angry 0.8%
Disgusted 0.7%
Confused 0.5%

AWS Rekognition

Age 21-29
Gender Male, 69.3%
Sad 40.9%
Calm 31.4%
Confused 21.1%
Angry 2.6%
Happy 1.4%
Disgusted 1.2%
Fear 0.8%
Surprised 0.5%

AWS Rekognition

Age 21-29
Gender Male, 96%
Sad 51.7%
Confused 21.2%
Happy 8.8%
Calm 7.3%
Disgusted 7.2%
Angry 2.2%
Surprised 1%
Fear 0.7%

AWS Rekognition

Age 25-35
Gender Male, 99.4%
Calm 54.3%
Surprised 25.9%
Fear 6.8%
Sad 6.5%
Disgusted 2.2%
Confused 1.9%
Angry 1.5%
Happy 0.8%
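
Each block above reports one detected face with an age range, a gender estimate, and per-emotion confidences, which is the shape of the AWS Rekognition DetectFaces response when all attributes are requested. A boto3 sketch with a placeholder file name:

import boto3

client = boto3.client("rekognition")
with open("photograph.jpg", "rb") as f:  # placeholder file
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        # e.g. "Calm 33.7%" -- one confidence per emotion label
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')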

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
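
The Google Vision blocks report bucketed likelihoods (Very unlikely through Very likely) rather than percentages, matching the face annotation fields of the Cloud Vision API. A sketch with the google-cloud-vision client, assuming credentials are configured in the environment and using a placeholder image URI:

from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image(source=vision.ImageSource(image_uri="https://example.org/photo.jpg"))
response = client.face_detection(image=image)

for face in response.face_annotations:
    # likelihood fields are enums ranging from VERY_UNLIKELY to VERY_LIKELY
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)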

Feature analysis

Amazon

Person
Shoe
Person 99.6%
Person 99.4%
Person 92.3%
Person 78.2%
Person 66.3%
Shoe 99%
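
The per-object Person and Shoe confidences above correspond to the per-instance detections that Rekognition DetectLabels returns for localizable labels. A sketch with a placeholder file name:

import boto3

client = boto3.client("rekognition")
with open("photograph.jpg", "rb") as f:  # placeholder file
    labels = client.detect_labels(Image={"Bytes": f.read()})["Labels"]

for label in labels:
    for instance in label.get("Instances", []):
        # one line per detected object, e.g. "Person 99.6%", "Shoe 99%"
        print(f'{label["Name"]} {instance["Confidence"]:.1f}%')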

Text analysis

Amazon

32
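
The single detected string "32" is the kind of result the AWS Rekognition DetectText API returns. A boto3 sketch with a placeholder file name:

import boto3

client = boto3.client("rekognition")
with open("photograph.jpg", "rb") as f:  # placeholder file
    response = client.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        # e.g. "32"
        print(detection["DetectedText"])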