Human Generated Data

Title

Untitled (women kneeling and standing in front of altar for Masonic ceremony)

Date

1951

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9359

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Human 97.7
Person 97.7
Person 96.3
Apparel 94.3
Clothing 94.3
Person 93.7
Person 93.6
Person 88.6
Person 85.8
Furniture 84.9
Coat 80.9
Suit 80.9
Overcoat 80.9
Bed 79.3
Indoors 79.2
Crowd 78.9
Person 77.7
Room 74.7
People 68
Person 67.5
Person 61.1
Living Room 55.8
Person 44.8
Person 43.2

Imagga
created on 2022-01-23

office 57.4
businessman 53
business 51
metropolitan 48.5
man 43.7
laptop 43.4
executive 39.9
computer 37.7
male 37.6
professional 36.7
corporate 35.2
people 35.1
work 33
meeting 32
working 30
person 29.5
businesspeople 29.4
businesswoman 29.1
team 27.8
adult 26.2
table 26.1
desk 25.6
suit 25.6
group 25
sitting 24.1
happy 23.8
communication 23.5
teamwork 23.2
job 21.2
manager 20.5
smiling 20.3
talking 20
colleagues 19.4
confident 19.1
men 18.9
indoors 18.5
notebook 18.3
together 17.5
workplace 17.2
smile 17.1
success 16.9
looking 16.8
discussion 16.5
successful 16.5
teacher 15.7
indoor 15.5
corporation 15.4
worker 15.1
partners 14.6
career 14.2
modern 14
handsome 13.4
room 13
education 13
portrait 12.9
discussing 12.8
casual 12.7
busy 12.5
employee 12.4
mature 12.1
glasses 12
women 11.9
finance 11.8
coworkers 11.8
partnership 11.5
horizontal 10.9
explaining 10.8
hand 10.6
mid adult 10.6
project 10.6
presentation 10.2
company 10.2
lifestyle 10.1
face 9.9
businessmen 9.8
cooperation 9.7
technology 9.6
couple 9.6
serious 9.5
tie 9.5
pen 9.4
lawyer 9
director 9
cheerful 8.9
conference 8.8
conversation 8.7
employment 8.7
hands 8.7
30s 8.7
day 8.6
happiness 8.6
formal 8.6
females 8.5
keyboard 8.4
color 8.3
training 8.3
occupation 8.2
student 8.2
alone 8.2
center 8
briefing 7.9
associate 7.9
associates 7.9
seminar 7.9
standing 7.8
agreement 7.8
workers 7.8
partner 7.7
jacket 7.7
attractive 7.7
collar 7.7
boss 7.7
musical instrument 7.6
chair 7.6
one person 7.5
senior 7.5
camera 7.4
phone 7.4
classroom 7.4
baron 7.2
bright 7.1
paper 7.1

Google
created on 2022-01-23

Black 89.6
Black-and-white 86.3
Style 83.9
Chair 83.6
Art 78.5
Suit 78.1
Font 77.6
Monochrome photography 74.6
Monochrome 73.1
Event 72.4
Room 67.5
History 65.1
Stock photography 62.9
Classic 60.8
Visual arts 59.9
Hat 59.4
Vintage clothing 57.6
Flag 56.9
Rectangle 55.4
Sitting 53.3

Microsoft
created on 2022-01-23

indoor 87.1
text 80.2
musical instrument 74.2
person 63.9

Face analysis

AWS Rekognition

Age 26-36
Gender Male, 97.5%
Calm 35.9%
Happy 30.9%
Surprised 24.7%
Fear 5.6%
Sad 1.3%
Disgusted 0.7%
Angry 0.6%
Confused 0.3%

AWS Rekognition

Age 41-49
Gender Female, 86.8%
Calm 62.9%
Surprised 12.7%
Sad 9.7%
Confused 5%
Fear 3.3%
Disgusted 3.2%
Angry 2.4%
Happy 0.8%

AWS Rekognition

Age 20-28
Gender Female, 59.8%
Calm 70.5%
Sad 11.2%
Happy 10%
Confused 2.3%
Fear 2.1%
Surprised 1.5%
Angry 1.2%
Disgusted 1.2%

AWS Rekognition

Age 12-20
Gender Female, 56.5%
Sad 64.4%
Confused 21.6%
Calm 8.3%
Surprised 1.7%
Fear 1.3%
Angry 1.1%
Happy 0.9%
Disgusted 0.8%

AWS Rekognition

Age 31-41
Gender Female, 51.3%
Sad 72.3%
Calm 15.7%
Confused 4.6%
Angry 2.8%
Disgusted 1.6%
Happy 1.2%
Fear 1.2%
Surprised 0.8%

AWS Rekognition

Age 16-24
Gender Male, 94.5%
Calm 51.7%
Sad 32%
Surprised 3.6%
Fear 3.5%
Happy 3.2%
Disgusted 2.6%
Angry 1.8%
Confused 1.7%

AWS Rekognition

Age 21-29
Gender Female, 91.4%
Confused 38.2%
Sad 22.3%
Calm 18.9%
Fear 9.6%
Disgusted 4.6%
Happy 2.3%
Surprised 2.3%
Angry 1.7%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.7%
Bed 79.3%

Captions

Microsoft

a group of people on a bed 62.3%
a group of people standing in front of a window 62.2%
a group of people in a room 62.1%

Text analysis

Amazon

NAGON
YTEBAS NAGON
28
YTEBAS

Google

8
te3 8 2
te3
2