Human Generated Data

Title

Untitled (Marymount Operetta: attendants flanking king and queen in performance of "The Gondoliers")

Date

1939

People

Artist: Curtis Studio, American, active 1891 - 1935

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.13000

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Stage 99.9
Person 99.5
Human 99.5
Person 99.5
Person 99.5
Person 99.5
Person 99.4
Person 99.2
Person 98.1
Person 95.9
Clothing 78.6
Apparel 78.6
Crowd 74.8
Person 73.2
Shorts 66.7
People 62.3
Leisure Activities 60.4
Room 58.5
Indoors 58.5
Dance Pose 56.5
Shoe 55.5
Footwear 55.5

Clarifai
created on 2023-10-29

people 99.8
group together 99.5
many 98.2
group 97.9
school 96.6
child 96.1
athlete 95.9
woman 95.4
adult 95.2
man 95.1
education 95
boy 94.7
competition 94.6
squad 92.7
victory 92
uniform 92
outfit 88.1
trophy 87.1
recreation 85.4
crowd 84.6

Imagga
created on 2022-02-05

people 32.3
man 29.7
group 24.1
male 24.1
person 24
adult 21.3
women 20.5
business 19.4
men 18.9
silhouette 17.4
couple 16.5
businessman 15.9
suit 15.3
happy 15
dance 14.8
happiness 14.1
black 13.7
dancer 13.4
party 12.9
human 12.7
professional 12.3
room 12.1
fashion 12
two 11.8
love 11.8
dress 11.7
lifestyle 11.5
crowd 11.5
together 11.4
sitting 11.1
stage 10.9
leisure 10.8
interior 10.6
dancing 10.6
celebration 10.4
portrait 10.3
corporate 10.3
musical instrument 10.2
life 9.9
team 9.8
modern 9.8
kin 9.8
pretty 9.8
job 9.7
fun 9.7
indoors 9.7
style 9.6
light 9.3
executive 9.3
performer 9.2
dark 9.2
hand 9.1
holding 9.1
wind instrument 9
office 8.9
cheerful 8.9
lady 8.9
night 8.9
looking 8.8
worker 8.7
smiling 8.7
work 8.6
chair 8.5
meeting 8.5
casual 8.5
friends 8.4
brass 8.4
attractive 8.4
active 8.2
businesswoman 8.2
platform 8.1
activity 8.1
window 7.9
holiday 7.9
design 7.9
smile 7.8
employee 7.8
luxury 7.7
disco 7.7
motion 7.7
wine 7.6
elegance 7.5
communication 7.5
clothing 7.4
teamwork 7.4
glass 7.4
shadow 7.2
romantic 7.1
blackboard 7

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

person 97.5
clothing 71.8
group 70.5
posing 61.1
people 58.2
dance 56.3
altar 11.5

Color Analysis

Face analysis

AWS Rekognition

Age 27-37
Gender Male, 94%
Calm 99.2%
Sad 0.3%
Angry 0.1%
Disgusted 0.1%
Surprised 0.1%
Fear 0.1%
Confused 0.1%
Happy 0%

AWS Rekognition

Age 20-28
Gender Male, 72.7%
Calm 85.1%
Happy 12.7%
Fear 1.2%
Disgusted 0.4%
Surprised 0.3%
Angry 0.2%
Sad 0.1%
Confused 0.1%

AWS Rekognition

Age 31-41
Gender Male, 97.3%
Happy 95.3%
Calm 4%
Surprised 0.2%
Confused 0.2%
Disgusted 0.1%
Sad 0.1%
Fear 0.1%
Angry 0%

AWS Rekognition

Age 19-27
Gender Male, 77.6%
Happy 49.2%
Calm 44.3%
Sad 3.3%
Surprised 1.1%
Disgusted 0.7%
Confused 0.6%
Fear 0.5%
Angry 0.4%

AWS Rekognition

Age 23-31
Gender Male, 95.1%
Calm 97.2%
Surprised 2%
Happy 0.2%
Confused 0.2%
Disgusted 0.1%
Angry 0.1%
Sad 0.1%
Fear 0.1%

AWS Rekognition

Age 23-31
Gender Female, 52.7%
Calm 99.3%
Happy 0.3%
Fear 0.1%
Surprised 0.1%
Sad 0.1%
Confused 0.1%
Disgusted 0%
Angry 0%

AWS Rekognition

Age 24-34
Gender Male, 68.3%
Calm 80%
Sad 12.8%
Surprised 4.4%
Angry 1.4%
Disgusted 0.5%
Happy 0.4%
Confused 0.3%
Fear 0.1%

AWS Rekognition

Age 30-40
Gender Male, 92%
Calm 98.5%
Sad 0.7%
Confused 0.3%
Happy 0.2%
Surprised 0.1%
Disgusted 0.1%
Angry 0.1%
Fear 0%

AWS Rekognition

Age 35-43
Gender Female, 65.3%
Calm 99.5%
Sad 0.2%
Happy 0.1%
Fear 0.1%
Surprised 0.1%
Confused 0.1%
Disgusted 0%
Angry 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Shoe
Person 99.5%
Person 99.5%
Person 99.5%
Person 99.5%
Person 99.4%
Person 99.2%
Person 98.1%
Person 95.9%
Person 73.2%
Shoe 55.5%

Text analysis

Amazon

8-58418