Human Generated Data

Title

Untitled (Mask and Wig members dancing in a line)

Date

1942

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10666

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Human 99.1
Person 99.1
Person 99
Person 98.9
Person 97.7
Person 97
Person 96.8
Person 95.7
Shorts 94.3
Clothing 94.3
Apparel 94.3
Person 92.9
Person 90.4
Sports 87.1
Sport 87.1
Person 84.7
People 77.4
Face 66.1
Portrait 66
Photo 66
Photography 66
Leisure Activities 65.3
Working Out 61.9
Exercise 61.9
Flooring 60.9
Crowd 59.8
Dance Pose 58.2
Floor 58.2
Meal 57.8
Food 57.8
Fitness 57.2
Female 56.9
Girl 56.9

Imagga
created on 2022-01-15

classroom 44.4
room 42.8
business 32.2
man 29.6
hall 29.3
businessman 27.4
people 27.3
office 26.7
meeting 26.4
table 26
group 25
professional 24.7
male 24.1
person 23.3
laptop 21.5
teacher 20.8
team 20.6
work 20.4
teamwork 20.4
computer 20.3
communication 20.1
sitting 18.9
chair 18.9
corporate 18
adult 17.3
businesswoman 17.3
modern 16.8
working 16.8
education 16.4
executive 16.4
men 16.3
businesspeople 16.1
happy 15.7
center 15.1
success 14.5
smiling 14.5
desk 14.2
indoors 14.1
student 13.4
job 13.3
building 13.2
manager 13
women 12.6
lifestyle 12.3
indoor 11.9
suit 11.8
talking 11.4
together 11.4
screen 11.4
training 11.1
portrait 11
conference 10.7
seat 10.5
university 10.3
school 10.1
relaxation 10
board 9.9
teaching 9.7
interior 9.7
colleagues 9.7
technology 9.6
looking 9.6
workplace 9.5
color 9.5
company 9.3
smile 9.3
confident 9.1
worker 9
new 8.9
seminar 8.8
partners 8.7
entrepreneur 8.7
class 8.7
empty 8.6
study 8.4
row 8.4
presentation 8.4
blackboard 8.3
restaurant 8.3
occupation 8.2
equipment 8.1
monitor 8
employee 7.9
standing 7.8
hands 7.8
black 7.8
lounge 7.8
audience 7.8
discussion 7.8
glass 7.8
mid adult 7.7
busy 7.7
using 7.7
hand 7.6
educator 7.6
cinema 7.4
back 7.3
successful 7.3
theater 7.3
happiness 7

Google
created on 2022-01-15

Photograph 94.1
Font 81.5
Suit 75.9
Snapshot 74.3
Event 72.8
Chair 72.5
Photo caption 69.5
Room 66.2
Team 65.8
History 64.9
Monochrome 64.8
Monochrome photography 64.8
Advertising 62.6
Crowd 62.1
Stock photography 61.9
Rectangle 60.9
Art 57.9
Class 54.2
Crew 51.4
Sitting 51

Microsoft
created on 2022-01-15

text 99
person 91.4
outdoor 87.6
dance 84.1
posing 44.2

Face analysis

Amazon

Google

AWS Rekognition

Age 45-53
Gender Male, 99.9%
Confused 47.2%
Sad 42.5%
Happy 4.4%
Disgusted 2.5%
Surprised 1.3%
Angry 1%
Calm 0.6%
Fear 0.4%

AWS Rekognition

Age 51-59
Gender Male, 99.6%
Calm 77.8%
Sad 14.3%
Happy 4.6%
Surprised 1.2%
Disgusted 0.6%
Confused 0.6%
Angry 0.5%
Fear 0.4%

AWS Rekognition

Age 33-41
Gender Male, 99.8%
Calm 40.5%
Sad 21.8%
Happy 15.9%
Surprised 8.6%
Confused 7.4%
Disgusted 2.7%
Fear 1.7%
Angry 1.4%

AWS Rekognition

Age 54-62
Gender Male, 99.9%
Surprised 46.8%
Sad 19.8%
Confused 11.4%
Calm 8.5%
Happy 5.1%
Disgusted 3.7%
Fear 2.5%
Angry 2.2%

AWS Rekognition

Age 47-53
Gender Male, 99.9%
Calm 91.7%
Confused 4.3%
Sad 1.5%
Surprised 1.1%
Disgusted 0.4%
Happy 0.4%
Angry 0.3%
Fear 0.2%

AWS Rekognition

Age 20-28
Gender Female, 51.5%
Sad 75.9%
Calm 17.2%
Fear 2.4%
Angry 2.1%
Confused 0.9%
Happy 0.6%
Disgusted 0.6%
Surprised 0.3%

AWS Rekognition

Age 42-50
Gender Male, 66.8%
Calm 31.5%
Sad 31.2%
Confused 19.3%
Disgusted 6.5%
Happy 3.9%
Angry 2.8%
Fear 2.7%
Surprised 2.1%

AWS Rekognition

Age 23-31
Gender Female, 57.3%
Calm 94.1%
Happy 2%
Sad 1.8%
Fear 0.6%
Disgusted 0.5%
Surprised 0.5%
Confused 0.4%
Angry 0.3%

AWS Rekognition

Age 25-35
Gender Female, 71.1%
Sad 92.7%
Calm 2.5%
Disgusted 1.5%
Fear 1%
Confused 1%
Angry 0.8%
Surprised 0.3%
Happy 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.1%

Captions

Microsoft

a group of people posing for a photo 72.5%
a group of people standing in front of a building 72.4%
a group of people posing for a picture 72.3%

Text analysis

Amazon

150
as
21011.
PENNSYLVALI
21001.
2 / 0 11 .
V1
XAGOX V1
XAGOX

Google

21011. 1 TROY NOKINTS 150 EHIYLNA
1
150
NOKINTS
EHIYLNA
21011.
TROY