Human Generated Data

Title

Untitled (woman cutting cake at banquet table)

Date

c. 1950

People

Artist: Jack Rodden Studio, American, 1914 - 2016

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.13767

Machine Generated Data

Tags

Amazon
created on 2022-02-04

Person 99
Human 99
Person 98.7
Person 97.3
Person 97.2
Person 97.1
Indoors 96.3
Interior Design 96.3
Person 92.4
Person 92.2
Crowd 91.5
Room 90.4
Audience 87.9
Person 84.8
Person 79.9
Person 74
Person 64.8
Person 64
Hall 55.1
Theater 55.1
Auditorium 55.1
Person 43.1
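
The label list above matches the shape of output from AWS Rekognition's DetectLabels operation. A minimal sketch of producing tags in this form with boto3 follows; the file name and confidence threshold are illustrative assumptions, not part of the record.

import boto3

# Sketch: label tags in the "Person 99" form above via Rekognition's
# DetectLabels API. The file name and MinConfidence value are assumptions.
client = boto3.client("rekognition")

with open("banquet_photo.jpg", "rb") as f:  # hypothetical scan of the print
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=40,  # the list above bottoms out near 43
    )

# Each label carries a name and a confidence score, e.g. "Person 99.0"
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')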

Imagga
created on 2022-02-04

theater 39.2
building 28.1
stage 27.7
home theater 26.7
structure 26.5
cinema 26
people 21.2
silhouette 19.9
person 19.7
audience 19.5
platform 19.3
lights 18.5
event 17.6
symbol 17.5
player 17
crowd 16.3
television 16
training 15.7
nation 15.1
male 14.9
cheering 14.7
stadium 14.6
billboard 14.5
blackboard 14.5
skill 14.4
patriotic 14.4
flag 13.8
nighttime 13.7
championship 13.6
match 13.5
man 13.5
muscular 13.4
athlete 13.3
office 13.1
competition 12.8
icon 12.7
night 12.4
sport 12.3
design 11.8
field 11.7
team 11.6
park 11.6
business 11.5
black 11.4
signboard 11
architecture 10.9
light 10.7
room 10.3
bright 10
theater curtain 9.8
curtain 9.8
vibrant 9.6
screen 9.6
glowing 9.2
classroom 9.2
interior 8.8
chalkboard 8.8
court 8.8
shiny 8.7
smile 8.5
computer 8.5
city 8.3
world 8.2
indoor 8.2
global 8.2
happy 8.1
businessman 7.9
backhand 7.9
versus 7.9
racket 7.8
work 7.8
shorts 7.8
serve 7.8
tennis 7.8
window 7.8
modern 7.7
hand 7.7
old 7.7
child 7.6
house 7.5
famous 7.4
teamwork 7.4
entertainment 7.4
school 7.3
music 7.2
desk 7.1
travel 7
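
The Imagga tags above follow the response shape of Imagga's /v2/tags REST endpoint. A minimal sketch, assuming that endpoint, HTTP basic auth, and a hosted copy of the image; the credentials, the image URL, and the exact response layout are assumptions based on Imagga's public documentation.

import requests

API_KEY = "your_api_key"        # placeholder
API_SECRET = "your_api_secret"  # placeholder

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/banquet_photo.jpg"},  # placeholder
    auth=(API_KEY, API_SECRET),
)

# Tags arrive as {"tag": {"en": ...}, "confidence": ...} pairs,
# matching entries like "theater 39.2" above
for tag in response.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')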

Google
created on 2022-02-04

Microsoft
created on 2022-02-04

text 87.7
black and white 85.1
person 72.5
white 66.6
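
The Microsoft tags above correspond to the tag output of Azure's Computer Vision service. A minimal sketch using the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholders.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder
    CognitiveServicesCredentials("<your-key>"),              # placeholder
)

# Confidence comes back on a 0-1 scale; scaling by 100 matches
# the "text 87.7" style entries above
result = client.tag_image("https://example.com/banquet_photo.jpg")
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")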

Face analysis

AWS Rekognition

Age 36-44
Gender Male, 93%
Calm 38.1%
Confused 30.9%
Sad 14.9%
Angry 6%
Disgusted 5%
Happy 2.5%
Surprised 1.6%
Fear 1.1%

AWS Rekognition

Age 29-39
Gender Female, 88.6%
Calm 99.9%
Sad 0%
Happy 0%
Disgusted 0%
Confused 0%
Surprised 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 12-20
Gender Male, 67.6%
Calm 74.8%
Sad 12.3%
Confused 7.6%
Happy 1.9%
Angry 1%
Fear 0.9%
Disgusted 0.8%
Surprised 0.7%

AWS Rekognition

Age 18-24
Gender Female, 99.8%
Calm 46.6%
Happy 37.3%
Sad 12.2%
Angry 1.9%
Confused 0.8%
Fear 0.5%
Surprised 0.4%
Disgusted 0.3%

AWS Rekognition

Age 23-33
Gender Male, 93%
Calm 93.2%
Sad 6.3%
Confused 0.2%
Happy 0.1%
Disgusted 0.1%
Fear 0.1%
Angry 0%
Surprised 0%

AWS Rekognition

Age 37-45
Gender Male, 100%
Happy 32.6%
Sad 24.9%
Disgusted 19.3%
Calm 9.1%
Angry 6%
Surprised 3.8%
Fear 2.2%
Confused 2.1%

AWS Rekognition

Age 13-21
Gender Female, 99.1%
Sad 47.8%
Calm 24.2%
Happy 22.1%
Confused 3.6%
Angry 0.9%
Disgusted 0.8%
Surprised 0.4%
Fear 0.3%

AWS Rekognition

Age 20-28
Gender Male, 74%
Calm 50.1%
Sad 23.3%
Happy 15.5%
Fear 3.2%
Disgusted 3.2%
Confused 2.7%
Angry 1.1%
Surprised 0.9%

AWS Rekognition

Age 52-60
Gender Male, 99.8%
Sad 67.4%
Calm 13.2%
Confused 12.9%
Angry 1.9%
Happy 1.7%
Disgusted 1.6%
Fear 0.6%
Surprised 0.6%
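
Each AWS Rekognition block above (an age range, a gender with confidence, and ranked emotion scores) matches the per-face output of Rekognition's DetectFaces operation when full attributes are requested. A minimal sketch; the file name is an illustrative assumption.

import boto3

client = boto3.client("rekognition")

with open("banquet_photo.jpg", "rb") as f:  # hypothetical scan of the print
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # required for age, gender, and emotion fields
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # Emotions are scored individually; sorting recreates the ordering above
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')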

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
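
The Google Vision blocks above report bucketed likelihoods rather than percentages, matching the face-detection output of the google-cloud-vision client. A minimal sketch; the file name is a placeholder.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("banquet_photo.jpg", "rb") as f:  # hypothetical scan of the print
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each attribute is a Likelihood enum: VERY_UNLIKELY ... VERY_LIKELY
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)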

Feature analysis

Amazon

Person 99%

Captions

Microsoft

a group of people standing in front of a window 65.6%
a group of people in a room 65.5%
a group of people in front of a window 65.4%
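
The three ranked captions above match Azure Computer Vision's Describe Image operation, which returns candidate captions with confidence scores. A minimal sketch; the endpoint, key, and image URL are placeholders.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder
    CognitiveServicesCredentials("<your-key>"),              # placeholder
)

result = client.describe_image(
    "https://example.com/banquet_photo.jpg",  # placeholder
    max_candidates=3,  # the section above lists three candidate captions
)
for caption in result.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")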

Text analysis

Amazon

CLUB
ROS
LL
THE
CO-OPERATIVE
N
ID
a
a DE
DE
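
The Amazon fragments above are characteristic of Rekognition's DetectText output, which returns both whole lines and individual words; that is why short pieces such as "DE" appear alongside "a DE". A minimal sketch; the file name is an illustrative assumption.

import boto3

client = boto3.client("rekognition")

with open("banquet_photo.jpg", "rb") as f:  # hypothetical scan of the print
    response = client.detect_text(Image={"Bytes": f.read()})

# Detections carry a Type of either LINE or WORD, so the same text
# can surface twice, once inside a line and once on its own
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"])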

Google

CLUB
CO-OPERATIVE
THE
CO-OPERATIVE CLUB THE ROS LL
LL
ROS
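
The Google results above match google-cloud-vision text detection, where the first annotation is the full detected string ("CO-OPERATIVE CLUB THE ROS LL") and the remaining annotations are the individual words. A minimal sketch; the file name is a placeholder.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("banquet_photo.jpg", "rb") as f:  # hypothetical scan of the print
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)
# First entry: the whole detected string; subsequent entries: word tokens
for annotation in response.text_annotations:
    print(annotation.description)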