Human Generated Data

Title

Untitled (six young men and women posed sitting and laughing in living room)

Date

1951

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9375

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.4
Human 99.4
Person 98.5
Sitting 98.2
Person 98.1
Person 96.1
Person 96
Person 95.3
Furniture 88.4
Couch 87.4
Indoors 82
Clothing 80
Apparel 80
Room 76.8
People 67.3
Living Room 66.8
Senior Citizen 65.1
Suit 61.6
Coat 61.6
Overcoat 61.6
Table 59.1
Flower 57.2
Plant 57.2
Blossom 57.2
Face 55.8
Person 53

Clarifai
created on 2023-10-26

people 99.7
group 98.2
woman 97.5
adult 96.8
man 95.8
monochrome 95.1
chair 93.7
leader 92.4
sit 90.5
many 88
administration 87.6
group together 87.4
education 86
indoors 85.1
furniture 84.7
actor 83.9
music 82.6
child 81.2
room 79.9
league 79.1

Imagga
created on 2022-01-23

room 41.2
person 37.6
man 36.3
people 35.7
classroom 35.2
male 31.3
businessman 30.9
office 30.7
business 30.4
meeting 30.2
teacher 29.7
professional 29.6
table 24.8
smiling 24.6
group 24.2
adult 23.3
indoors 22.8
team 22.4
happy 21.9
home 21.5
sitting 21.5
teamwork 21.3
work 21.2
businesswoman 20.9
men 20.6
desk 18.5
women 17.4
corporate 17.2
executive 16.8
working 16.8
educator 16.7
chair 16.7
lifestyle 15.9
laptop 15.6
computer 15.2
smile 15
conference 14.7
suit 14.4
businesspeople 14.2
job 14.2
interior 14.2
together 14
presentation 14
indoor 13.7
colleagues 13.6
happiness 13.3
holding 13.2
cheerful 13
education 13
salon 13
worker 12.9
modern 12.6
talking 12.4
couple 12.2
student 11.8
communication 11.8
workplace 11.4
restaurant 11.3
director 11.2
board 11
life 10.3
shop 10.3
mature 10.2
portrait 9.7
success 9.7
two 9.3
manager 9.3
document 9.3
coffee 9.3
house 9.2
blackboard 9.1
kitchen 8.9
new 8.9
technology 8.9
family 8.9
kid 8.9
diverse 8.8
students 8.8
boy 8.7
partner 8.7
standing 8.7
school 8.7
class 8.7
exam 8.6
study 8.4
service 8.3
successful 8.2
girls 8.2
confident 8.2
cup 8.1
looking 8
to 8
teaching 7.8
color 7.8
two people 7.8
attractive 7.7
diversity 7.7
reading 7.6
adults 7.6
friends 7.5
learning 7.5
enjoyment 7.5
dinner 7.3
hall 7.3
occupation 7.3
20s 7.3
lady 7.3
handsome 7.1

Google
created on 2022-01-23

Chair 90
Black 89.7
Black-and-white 86.1
Style 84
Font 80.9
Suit 75.6
Art 74.7
Monochrome photography 74.6
Snapshot 74.3
Monochrome 73.3
Event 72.4
Table 70.9
Design 68.5
Room 67.2
Stock photography 62.4
History 59.9
Vintage clothing 59.6
Sitting 56.9
Classic 55
Team 54.6

Microsoft
created on 2022-01-23

table 96.7
text 95.1
person 94.8
clothing 94.1
man 86.8
furniture 67.7
woman 60.6
group 59.6

Face analysis

AWS Rekognition

Age 26-36
Gender Male, 98.7%
Sad 65%
Happy 19%
Confused 8.7%
Surprised 1.9%
Calm 1.9%
Disgusted 1.3%
Fear 1.1%
Angry 1.1%

AWS Rekognition

Age 50-58
Gender Female, 99.9%
Happy 95.4%
Confused 2.8%
Calm 0.8%
Sad 0.3%
Surprised 0.3%
Disgusted 0.1%
Angry 0.1%
Fear 0.1%

AWS Rekognition

Age 31-41
Gender Female, 98.8%
Happy 95%
Calm 2.5%
Sad 0.9%
Confused 0.6%
Surprised 0.3%
Fear 0.3%
Disgusted 0.2%
Angry 0.2%

AWS Rekognition

Age 25-35
Gender Male, 92.4%
Calm 61.8%
Sad 20.5%
Surprised 6.8%
Disgusted 3.5%
Confused 3.4%
Fear 2.1%
Happy 0.9%
Angry 0.8%

AWS Rekognition

Age 37-45
Gender Male, 95.9%
Confused 54.4%
Happy 17.5%
Sad 12.7%
Surprised 7.2%
Calm 2.9%
Fear 2%
Disgusted 1.7%
Angry 1.5%

AWS Rekognition

Age 22-30
Gender Male, 99.4%
Sad 81.9%
Confused 11.9%
Surprised 3%
Disgusted 0.8%
Angry 0.7%
Happy 0.7%
Fear 0.6%
Calm 0.4%

AWS Rekognition

Age 23-33
Gender Male, 98.1%
Calm 49.6%
Fear 22.5%
Disgusted 10%
Confused 7.5%
Angry 4%
Happy 2.8%
Sad 2%
Surprised 1.7%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%

Text analysis

Amazon

veuss
8a
YТ37А-MX

Google

verss
verss