Human Generated Data

Title

Untitled (performers on stage, crowd watching)

Date

c. 1950

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20073

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.6
Human 99.6
Person 99.4
Person 99.1
Person 99
Person 97.6
Person 95.6
Person 95.3
Person 94.9
Person 94.2
Interior Design 93.9
Indoors 93.9
Person 91.7
Room 91
Person 90.6
Person 88.2
Person 84.7
Person 78.4
Person 78.4
Meal 77.1
Food 77.1
Restaurant 76.8
Crowd 75.7
Theme Park 69.9
Amusement Park 69.9
People 67.1
Person 65.9
Person 65.5
Person 60.9
Cafeteria 58.8
Classroom 57.5
School 57.5
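Each machine-generated tag above pairs a label with a confidence score (0–100). A minimal sketch of filtering such tags by a confidence threshold, using a few of the Amazon values listed; the `tags` list and `confident_tags` helper are illustrative, not part of any vendor API:

```python
# A handful of the Amazon Rekognition labels listed above,
# as (label, confidence) pairs.
tags = [
    ("Person", 99.6),
    ("Interior Design", 93.9),
    ("Crowd", 75.7),
    ("Theme Park", 69.9),
    ("Classroom", 57.5),
]

def confident_tags(tags, threshold=75.0):
    """Keep only labels whose confidence meets the threshold."""
    return [label for label, score in tags if score >= threshold]

print(confident_tags(tags))  # labels scoring 75.0 or higher
```

The same label-plus-confidence shape applies to the Clarifai, Imagga, Google, and Microsoft tag lists that follow, so one filter works across providers.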

Clarifai
created on 2023-10-22

people 99.8
group 98.6
woman 97.6
man 96.9
many 96.8
adult 95.2
monochrome 94.3
child 94.1
group together 93.5
music 93.1
recreation 90.7
education 90
indoors 86.4
crowd 86.4
school 83.8
furniture 83.1
dancing 81.9
audience 81.7
several 81
musician 80.3

Imagga
created on 2022-03-05

stage 45.3
musician 28.7
platform 28
singer 26.4
music 24.7
man 23.5
people 21.2
barroom 20.3
male 19.8
sax 19.1
performer 19
person 18.7
concert 18.4
guitar 17.1
adult 16.2
musical 15.3
group 15.3
player 15.1
rock 13.9
performance 13.4
musical instrument 13.3
black 13.2
sound 13.1
wind instrument 13.1
shop 12.9
men 12.9
bass 12.9
brass 12.6
instrument 12.1
play 12.1
entertainment 12
guitarist 10.8
band 10.7
business 10.3
lifestyle 10.1
studio 9.9
show 9.5
party 9.4
building 9.4
club 9.4
house 9.2
art 9.1
modern 9.1
hand 9.1
style 8.9
entertainer 8.9
indoors 8.8
crowd 8.6
student 8.5
design 8.4
star 8.4
city 8.3
teacher 8.3
street 8.3
fun 8.2
playing 8.2
professional 8.1
room 8
home 8
sing 7.8
audio 7.6
inside 7.4
microphone 7.3
team 7.2
handsome 7.1
women 7.1
work 7.1
businessman 7.1
stringed instrument 7

Google
created on 2022-03-05

Photograph 94.3
Black 89.8
Black-and-white 86.8
Style 84.1
Chair 82.9
People 78.3
Monochrome 77.1
Crowd 77
Monochrome photography 76.6
Snapshot 74.3
Event 74.2
Hat 71.2
Room 68.7
Suit 68.6
Stock photography 63.5
History 62.1
Fun 61.8
Font 61
Art 58.2
T-shirt 56.1

Microsoft
created on 2022-03-05

person 97.6
clothing 96.1
text 91.8
man 86.5
group 70.9
dance 54.7
crowd 1

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 21-29
Gender Female, 74.6%
Calm 46.6%
Happy 29.5%
Sad 7.1%
Fear 5.2%
Angry 4%
Disgusted 3.7%
Surprised 2.3%
Confused 1.7%

AWS Rekognition

Age 22-30
Gender Male, 86.6%
Calm 84.6%
Angry 7%
Sad 6.4%
Fear 0.5%
Confused 0.4%
Surprised 0.4%
Happy 0.3%
Disgusted 0.3%

AWS Rekognition

Age 20-28
Gender Male, 65.8%
Sad 58.9%
Calm 20.7%
Fear 9.8%
Confused 2.4%
Surprised 2.4%
Angry 2.3%
Disgusted 2%
Happy 1.5%

Feature analysis

Amazon

Person
Person 99.6%
Person 99.4%
Person 99.1%
Person 99%
Person 97.6%
Person 95.6%
Person 95.3%
Person 94.9%
Person 94.2%
Person 91.7%
Person 90.6%
Person 88.2%
Person 84.7%
Person 78.4%
Person 78.4%
Person 65.9%
Person 65.5%
Person 60.9%

Categories

Text analysis

Amazon

9
KODAK
EXIT
SAFETY
PILA

Google

SAFET
SAFET