Human Generated Data

Title

Untitled (presentation at conference)

Date

1952

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20173

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.4
Human 99.4
Person 99.2
Indoors 98.2
Person 98
Person 96.3
Person 94.5
Person 92.7
Person 92.1
Person 91
Person 91
Person 87.4
Person 85.5
Hall 82.7
Person 79.6
Room 79.2
Vehicle 70.3
Airplane 70.3
Transportation 70.3
Aircraft 70.3
Conference Room 69.4
Meeting Room 69.4
Person 65.8
Aisle 64.8
Theater 58.3
Auditorium 58.3
Person 48.8
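
The Amazon tags above are name/confidence pairs of the kind returned by the Rekognition DetectLabels API. The following is a minimal sketch of how such a tag list could be produced, assuming the boto3 SDK, configured AWS credentials, and a hypothetical local image file; it is an illustration, not the pipeline actually used for this record.

# Minimal sketch: label tags with Amazon Rekognition.
# Assumes boto3 and AWS credentials; "photograph.jpg" is a hypothetical file name.
import boto3

rekognition = boto3.client("rekognition")

with open("photograph.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=45,  # the list above bottoms out near 48.8
)

# Print "Name Confidence" pairs, mirroring the format of the tag list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")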

Imagga
created on 2022-03-05

classroom 30.9
people 25.6
person 25.5
room 25
man 23.6
silhouette 22.3
teacher 20.8
male 19.1
businessman 18.5
group 18.5
business 18.2
black 18
gymnasium 15.6
education 15.6
symbol 15.5
student 15.4
blackboard 14.5
team 14.3
design 12.4
athletic facility 11.7
night 11.5
hand 11.4
meeting 11.3
school 11.2
grunge 11.1
board 10.8
adult 10.7
teaching 10.7
class 10.6
crowd 10.5
success 10.4
facility 10.4
men 10.3
idea 9.8
audience 9.7
drawing 9.6
art 9.6
hall 9.5
happy 9.4
professional 9.2
event 9.2
modern 9.1
human 9
nation 8.5
finance 8.4
sign 8.3
graphic 8
life 8
office 7.9
nighttime 7.8
stadium 7.8
silhouettes 7.8
party 7.7
patriotic 7.7
manager 7.4
sport 7.4
star 7.4
training 7.4
smiling 7.2
music 7.2
icon 7.1
love 7.1
child 7.1
bass 7

Google
created on 2022-03-05

Photograph 94.3
White 92.2
Bird 91.3
Black 90.2
Chair 89.6
Black-and-white 84.2
Style 84.1
Line 82
Suit 81.2
Font 79.8
Crowd 79.2
People 78.8
Art 78.6
Monochrome 77.5
Monochrome photography 76.5
Snapshot 74.3
Event 73.2
Stock photography 66.3
Room 65.2
Illustration 61.9

Microsoft
created on 2022-03-05

cartoon 90.5
person 89.6
window 85.5
text 72.1
clothing 71.9
black and white 55.6

Face analysis

AWS Rekognition

Age 18-24
Gender Female, 86.3%
Calm 97.1%
Sad 0.7%
Happy 0.6%
Confused 0.5%
Angry 0.4%
Fear 0.3%
Disgusted 0.2%
Surprised 0.2%

AWS Rekognition

Age 18-26
Gender Female, 63.5%
Calm 46.7%
Sad 13.1%
Angry 11.4%
Happy 8.3%
Disgusted 5.8%
Confused 5.5%
Fear 5%
Surprised 4.2%

AWS Rekognition

Age 13-21
Gender Female, 96.6%
Happy 49.5%
Calm 24.8%
Sad 8.4%
Fear 8.4%
Angry 5%
Surprised 1.7%
Confused 1.3%
Disgusted 0.9%

AWS Rekognition

Age 14-22
Gender Male, 88%
Calm 96.7%
Sad 1%
Fear 0.8%
Happy 0.6%
Surprised 0.3%
Angry 0.2%
Disgusted 0.2%
Confused 0.2%

AWS Rekognition

Age 16-22
Gender Female, 98.1%
Angry 51.6%
Calm 15.9%
Sad 11.8%
Happy 5.7%
Surprised 5.4%
Confused 3.7%
Fear 3.4%
Disgusted 2.4%

AWS Rekognition

Age 18-26
Gender Male, 62.7%
Calm 95.2%
Sad 1.6%
Happy 1.4%
Fear 0.8%
Surprised 0.4%
Confused 0.2%
Disgusted 0.2%
Angry 0.2%

AWS Rekognition

Age 22-30
Gender Male, 99.3%
Calm 90.4%
Sad 3.5%
Confused 1.9%
Fear 1.2%
Happy 1.1%
Angry 1%
Disgusted 0.5%
Surprised 0.5%
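
Each AWS Rekognition block above reports an estimated age range, a gender guess with confidence, and a ranked list of emotions for one detected face. A minimal sketch of how such per-face estimates could be retrieved, assuming boto3, AWS credentials, and a hypothetical image file:

# Minimal sketch: per-face age/gender/emotion estimates with Amazon Rekognition.
# Assumes boto3 and AWS credentials; "photograph.jpg" is a hypothetical file name.
import boto3

rekognition = boto3.client("rekognition")

with open("photograph.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # required for AgeRange, Gender, and Emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back as TYPE/Confidence pairs; sort highest first,
    # matching the ordering used in the blocks above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")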

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely
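
The Google Vision blocks report likelihood buckets (Very unlikely through Very likely) rather than percentages. A minimal sketch of how such per-face likelihoods could be obtained, assuming the google-cloud-vision Python client (2.x or later), configured credentials, and a hypothetical image file:

# Minimal sketch: per-face likelihood buckets with Google Cloud Vision.
# Assumes google-cloud-vision 2.x+ and configured credentials;
# "photograph.jpg" is a hypothetical file name.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photograph.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each face annotation carries a Likelihood enum per attribute,
# e.g. VERY_UNLIKELY or LIKELY, as listed above.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)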

Feature analysis

Amazon

Person 99.4%
Airplane 70.3%

Captions

Microsoft

a group of people standing in front of a window 70.5%
a crowd of people standing in front of a window 70.4%
a group of people in front of a window 69.8%

Text analysis

Amazon

S
EAD
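
The text analysis entries are fragments of lettering picked up in the image by OCR-style detection. A minimal sketch of how such fragments could be extracted with the Rekognition DetectText API, again assuming boto3, AWS credentials, and a hypothetical image file:

# Minimal sketch: text detection with Amazon Rekognition.
# Assumes boto3 and AWS credentials; "photograph.jpg" is a hypothetical file name.
import boto3

rekognition = boto3.client("rekognition")

with open("photograph.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Print each detected line of text, which is the level of fragment listed above.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])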