Human Generated Data

Title

Untitled (performers on stage, crowd watching)

Date

c. 1950

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20074

Machine Generated Data

Tags (label, confidence score 0-100)

Amazon
created on 2022-03-05

Person 99.3
Human 99.3
Person 96.5
Indoors 95.6
Room 95.6
Person 95.6
Person 95.3
Person 93.5
Person 92.6
Interior Design 90.8
Person 85
Person 83.9
Person 82.2
Person 76.9
People 74.8
Person 73.8
Person 72.8
Person 72.7
Person 72
Leisure Activities 71.6
Person 71.2
Person 68
Person 67.1
Crowd 62.9
Musician 61.2
Musical Instrument 61.2
Person 60.3
Food 60.3
Meal 60.3
Living Room 59.3
Dressing Room 58.3
Guitar 55.1
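
The Amazon labels and scores above are the output of AWS Rekognition's label-detection API. A minimal sketch of how such tags are typically retrieved with boto3 (the S3 bucket and key below are hypothetical placeholders, not the museum's actual storage):

    import boto3

    rekognition = boto3.client("rekognition")

    # DetectLabels returns object/scene labels with confidence scores
    # (0-100), matching the label/score pairs listed above.
    response = rekognition.detect_labels(
        Image={"S3Object": {"Bucket": "museum-images",        # hypothetical
                            "Name": "fogg/4.2002.20074.jpg"}},
        MaxLabels=50,
        MinConfidence=55.0,  # the lowest score shown above is Guitar 55.1
    )

    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')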

Imagga
created on 2022-03-05

room 31.5
interior 26.5
salon 23.3
table 21.1
indoors 21.1
modern 18.9
people 17.8
home 16.7
classroom 15.9
furniture 15.8
shop 14.5
lifestyle 14.4
chair 14.4
guitar 13.4
house 13.4
decor 13.2
person 13.1
inside 12.9
music 12.8
style 12.6
elegance 12.6
glass 12.6
man 12.1
light 12
adult 11.9
indoor 11.9
hall 11.8
design 11.8
musician 11.5
male 11.3
luxury 11.1
stage 11
architecture 10.9
fashion 10.5
business 10.3
3d 10.1
equipment 10
wood 10
window 9.9
comfortable 9.5
life 9.5
men 9.4
singer 9.4
black 9
group 8.9
restaurant 8.8
concert 8.7
urban 8.7
performer 8.7
women 8.7
rock 8.7
decoration 8.7
building 8.6
reflection 8.6
office 8.5
floor 8.4
studio 8.4
musical instrument 8.3
teacher 8.2
outfit 8.1
kitchen 8
bass 7.9
apartment 7.7
sofa 7.6
player 7.5
instrument 7.5
classic 7.4
team 7.2
work 7.1
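
Imagga's scores come from its auto-tagging REST endpoint. A minimal sketch using the v2 /tags endpoint (the API key, secret, and image URL are hypothetical):

    import requests

    # Hypothetical credentials and image URL.
    API_KEY = "acc_xxxxxxxx"
    API_SECRET = "secret_xxxxxxxx"
    IMAGE_URL = "https://example.org/fogg/4.2002.20074.jpg"

    # /v2/tags returns tags with confidence scores (0-100), as listed above.
    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(API_KEY, API_SECRET),
    )
    response.raise_for_status()

    for item in response.json()["result"]["tags"]:
        print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')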

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 96.2
person 92.2
clothing 91.4
man 61
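
The Microsoft tags come from Azure's Computer Vision service. A minimal sketch with the azure-cognitiveservices-vision-computervision SDK (the endpoint, key, and image URL are hypothetical):

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Hypothetical endpoint, key, and image URL.
    ENDPOINT = "https://example.cognitiveservices.azure.com/"
    KEY = "xxxxxxxxxxxx"
    IMAGE_URL = "https://example.org/fogg/4.2002.20074.jpg"

    client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

    # tag_image returns content tags with confidences in 0-1,
    # shown above scaled to 0-100.
    result = client.tag_image(IMAGE_URL)
    for tag in result.tags:
        print(f"{tag.name} {tag.confidence * 100:.1f}")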

Face analysis

AWS Rekognition

Age 12-20
Gender Female, 57.2%
Calm 87.6%
Sad 9.1%
Happy 1.1%
Fear 0.8%
Angry 0.7%
Disgusted 0.3%
Surprised 0.3%
Confused 0.2%

AWS Rekognition

Age 14-22
Gender Female, 98.6%
Calm 80.9%
Sad 10.1%
Surprised 3.3%
Happy 3.2%
Confused 1%
Angry 0.7%
Disgusted 0.5%
Fear 0.3%

AWS Rekognition

Age 34-42
Gender Female, 99.6%
Calm 94.1%
Happy 2%
Sad 0.8%
Surprised 0.8%
Confused 0.6%
Angry 0.6%
Fear 0.6%
Disgusted 0.5%

AWS Rekognition

Age 28-38
Gender Female, 83%
Calm 96.8%
Sad 2.9%
Surprised 0.2%
Confused 0.1%
Happy 0%
Angry 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 21-29
Gender Female, 82.2%
Sad 57.2%
Calm 23.1%
Confused 8.6%
Angry 4.9%
Happy 2.6%
Disgusted 1.5%
Surprised 1.1%
Fear 1%

AWS Rekognition

Age 23-33
Gender Male, 65.8%
Calm 48.5%
Confused 23.8%
Sad 10.3%
Happy 8.1%
Disgusted 4.5%
Fear 2.1%
Surprised 1.3%
Angry 1.3%
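
Each AWS Rekognition block above describes one detected face: an estimated age range, a gender estimate with confidence, and per-emotion confidence scores. A minimal sketch of the boto3 call that produces these fields (the S3 location is again a hypothetical placeholder):

    import boto3

    rekognition = boto3.client("rekognition")

    # DetectFaces with Attributes=["ALL"] returns, per face, an age range,
    # a gender estimate, and emotion confidences -- the fields shown above.
    response = rekognition.detect_faces(
        Image={"S3Object": {"Bucket": "museum-images",        # hypothetical
                            "Name": "fogg/4.2002.20074.jpg"}},
        Attributes=["ALL"],
    )

    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')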

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely
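
The Google Vision blocks report per-face likelihood ratings (VERY_UNLIKELY through VERY_LIKELY) rather than percentages. A minimal sketch with the google-cloud-vision client (the Cloud Storage URI is hypothetical):

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    # Hypothetical Cloud Storage URI for the image.
    image = vision.Image(
        source=vision.ImageSource(
            image_uri="gs://museum-images/fogg/4.2002.20074.jpg"
        )
    )

    # face_detection rates each attribute on the likelihood scale
    # shown in the blocks above.
    response = client.face_detection(image=image)
    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)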

Feature analysis

Amazon

Person 99.3%

Captions

Microsoft

a group of people posing for the camera 86.7%
a group of people posing for a photo 82.4%
a group of people posing for a picture 82.3%
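
The ranked captions come from Azure Computer Vision's image-description endpoint, which returns several candidate captions with confidences. A minimal sketch, reusing the hypothetical endpoint, key, and image URL from the tag example above:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://example.cognitiveservices.azure.com/",   # hypothetical
        CognitiveServicesCredentials("xxxxxxxxxxxx"),     # hypothetical
    )

    # describe_image returns up to max_candidates ranked captions,
    # each with a confidence in 0-1 (shown above as %).
    result = client.describe_image(
        "https://example.org/fogg/4.2002.20074.jpg",      # hypothetical
        max_candidates=3,
    )
    for caption in result.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}%")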

Text analysis

Amazon

8
FILM
KODAK
ALBANS
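
The fragments above are OCR results, most likely edge markings printed on the Kodak film stock. A minimal sketch of the AWS Rekognition text-detection call that produces them (the S3 location is a hypothetical placeholder):

    import boto3

    rekognition = boto3.client("rekognition")

    # DetectText performs OCR on the image and returns LINE and WORD
    # detections; the fragments above are the detected lines.
    response = rekognition.detect_text(
        Image={"S3Object": {"Bucket": "museum-images",        # hypothetical
                            "Name": "fogg/4.2002.20074.jpg"}}
    )

    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])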