Human Generated Data

Title

Untitled (performers on stage)

Date

c. 1950

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20220

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Dance Pose 99.6
Leisure Activities 99.6
Person 99.3
Human 99.3
Person 99.3
Person 99.2
Person 99.1
Person 97.8
Person 97.4
Person 94.1
Dance 93.7
Performer 93.3
Interior Design 93
Indoors 93
Person 91.4
Tango 86.9
Room 75.3
Person 66.8
Flooring 64.1
Photography 63.1
Photo 63.1
Person 60
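
The Amazon tags above are the kind of per-image labels returned by AWS Rekognition's DetectLabels operation. A minimal sketch of such a call using boto3 follows; the image file name and thresholds are illustrative assumptions, not values taken from this record.

    # Sketch: print a "label confidence" listing like the Amazon tag list above.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("performers_on_stage.jpg", "rb") as f:  # hypothetical local copy of the photograph
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=25,
        MinConfidence=60,  # the list above bottoms out around 60
    )

    for label in response["Labels"]:
        # e.g. "Dance Pose 99.6", "Person 99.3", ...
        print(f'{label["Name"]} {label["Confidence"]:.1f}')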

Clarifai
created on 2023-10-22

people 99.9
group 98.9
group together 97.1
adult 96.1
woman 95.7
man 94.9
monochrome 93.9
furniture 92.9
two 92.4
canine 92
music 89
actress 88.8
actor 88.2
several 87.7
dog 86.2
movie 85
recreation 84.4
wear 83.7
room 83.7
leader 83.6
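
The Clarifai tags above come from Clarifai's general visual-recognition model. A minimal sketch of a comparable request against Clarifai's v2 REST predict endpoint is shown below; the endpoint, model id, key, and image URL are assumptions based on the public API documentation, not values from this record.

    # Sketch: Clarifai-style concept tags ("people 99.9", "group 98.9", ...).
    import requests

    API_KEY = "YOUR_CLARIFAI_KEY"                               # placeholder
    IMAGE_URL = "https://example.org/performers_on_stage.jpg"   # placeholder

    resp = requests.post(
        "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
        headers={"Authorization": f"Key {API_KEY}"},
        json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
        timeout=30,
    )
    resp.raise_for_status()

    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        # Clarifai scores concepts on a 0-1 scale; the list above shows percentages.
        print(f'{concept["name"]} {concept["value"] * 100:.1f}')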

Imagga
created on 2022-03-05

musical instrument 61.3
accordion 48.8
keyboard instrument 39
wind instrument 37.6
person 31.9
dancer 30.7
performer 28.8
adult 23.9
man 21.5
people 20.6
male 18.5
silhouette 17.4
entertainer 17.3
sport 17.1
posing 15.1
fashion 15.1
black 15
model 14.8
dark 14.2
exercise 13.6
fitness 13.6
couple 13.1
lifestyle 13
action 13
sexy 12.8
one 12.7
attractive 12.6
city 12.5
leisure 12.5
urban 12.2
boy 12.2
lady 12.2
body 12
pretty 11.9
active 11.7
motion 11.1
portrait 11
dress 10.8
sunset 10.8
happy 10.7
run 10.6
style 10.4
hair 10.3
outside 10.3
pose 10
outdoor 9.9
dance 9.9
modern 9.8
performance 9.6
athlete 9.5
sensual 9.1
cool 8.9
dancing 8.7
sitting 8.6
men 8.6
guy 8.5
elegance 8.4
health 8.3
human 8.2
teenager 8.2
dirty 8.1
activity 8.1
sun 8
water 8
women 7.9
youth 7.7
moving 7.6
casual 7.6
walk 7.6
walking 7.6
legs 7.5
fun 7.5
outdoors 7.5
training 7.4
competition 7.3
equipment 7.3
sensuality 7.3
summer 7.1
day 7.1
clothing 7
together 7
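
The Imagga tags above are typical output of Imagga's auto-tagging endpoint, which scores tags on a 0-100 scale. A minimal sketch follows; the credentials and image URL are placeholders, and the endpoint and response shape follow Imagga's public v2 API documentation rather than this record.

    # Sketch: Imagga-style tags ("musical instrument 61.3", "accordion 48.8", ...).
    import requests

    IMAGGA_KEY = "YOUR_KEY"                                     # placeholder
    IMAGGA_SECRET = "YOUR_SECRET"                               # placeholder
    IMAGE_URL = "https://example.org/performers_on_stage.jpg"   # placeholder

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),  # HTTP Basic auth with key/secret
        timeout=30,
    )
    resp.raise_for_status()

    for item in resp.json()["result"]["tags"]:
        print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')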

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

dance 94.5
person 94.3
text 93.4
clothing 92.9
black and white 85.5
footwear 81.5
woman 72.6
dress 67
image 32
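
The Microsoft tags above resemble the output of the Azure Computer Vision "Analyze Image" REST operation with the Tags feature. A minimal sketch is given below; the resource endpoint, key, API version, and image URL are assumptions for illustration.

    # Sketch: Azure Computer Vision tags ("dance 94.5", "person 94.3", ...).
    import requests

    ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
    KEY = "YOUR_AZURE_KEY"                                          # placeholder
    IMAGE_URL = "https://example.org/performers_on_stage.jpg"       # placeholder

    resp = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": KEY},
        json={"url": IMAGE_URL},
        timeout=30,
    )
    resp.raise_for_status()

    for tag in resp.json()["tags"]:
        # Azure scores tags on a 0-1 scale; the list above shows percentages.
        print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')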

Face analysis

AWS Rekognition

Age 34-42
Gender Male, 83.3%
Calm 95%
Sad 3.2%
Confused 1%
Surprised 0.2%
Happy 0.2%
Disgusted 0.2%
Fear 0.2%
Angry 0.1%
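
The age range, gender, and emotion percentages above are the standard fields of an AWS Rekognition face detail. A minimal sketch using boto3 detect_faces with full attributes follows; the image file name is a placeholder.

    # Sketch: age/gender/emotion readout like the AWS Rekognition block above.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("performers_on_stage.jpg", "rb") as f:  # hypothetical local copy
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # required to get AgeRange, Gender, and Emotions
    )

    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            # e.g. "Calm 95.0%", "Sad 3.2%", ...
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')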

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
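
Each Google Vision block above reports per-face likelihoods (surprise, anger, sorrow, joy, headwear, blur) on the VERY_UNLIKELY to VERY_LIKELY scale. A minimal sketch with the google-cloud-vision client follows; the file path is a placeholder.

    # Sketch: per-face likelihood readout like the Google Vision blocks above.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("performers_on_stage.jpg", "rb") as f:  # hypothetical local copy
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        likelihood = vision.Likelihood  # enum: VERY_UNLIKELY ... VERY_LIKELY
        print("Surprise", likelihood(face.surprise_likelihood).name)
        print("Anger", likelihood(face.anger_likelihood).name)
        print("Sorrow", likelihood(face.sorrow_likelihood).name)
        print("Joy", likelihood(face.joy_likelihood).name)
        print("Headwear", likelihood(face.headwear_likelihood).name)
        print("Blurred", likelihood(face.blurred_likelihood).name)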

Feature analysis

Amazon

Person
Person 99.3%
Person 99.3%
Person 99.2%
Person 99.1%
Person 97.8%
Person 97.4%
Person 94.1%
Person 91.4%
Person 66.8%
Person 60%
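
The per-person confidences above correspond to individual detections of the "Person" label; Rekognition's DetectLabels reports these under each label's Instances field. A short sketch, reusing the boto3 call from the tag section (placeholder image path):

    # Sketch: one line per detected person, e.g. "Person 99.3%".
    import boto3

    rekognition = boto3.client("rekognition")
    with open("performers_on_stage.jpg", "rb") as f:  # hypothetical local copy
        response = rekognition.detect_labels(Image={"Bytes": f.read()}, MinConfidence=50)

    for label in response["Labels"]:
        if label["Name"] == "Person":
            for instance in label["Instances"]:  # bounding box + confidence per person
                print(f'Person {instance["Confidence"]:.1f}%')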

Text analysis

Amazon

2
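
The single character above is the sort of short string surfaced by AWS Rekognition's DetectText operation. A minimal sketch (placeholder image path):

    # Sketch: print detected text lines with their confidences.
    import boto3

    rekognition = boto3.client("rekognition")
    with open("performers_on_stage.jpg", "rb") as f:  # hypothetical local copy
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"], round(detection["Confidence"], 1))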

Google

YT37A°2-XAGON
YT37A°2-XAGON
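
Strings like the ones above are the kind of output returned by Google Cloud Vision text detection, which lists the full detected text first and then each individual element. A minimal sketch (placeholder file path):

    # Sketch: print every text annotation Google Vision finds in the image.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("performers_on_stage.jpg", "rb") as f:  # hypothetical local copy
        image = vision.Image(content=f.read())

    response = client.text_detection(image=image)
    for annotation in response.text_annotations:
        print(annotation.description)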