Human Generated Data

Title

Untitled (female graduate in cap and gown receiving diploma)

Date

1948

People

Artist: Harry Annas, American, 1897 - 1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2970

Machine Generated Data

Tags

Amazon
created on 2022-01-21

Stage 99.4
Person 98.9
Human 98.9
Person 97.2
Person 96.3
Person 93.4
Person 91.5
Person 91.2
Musical Instrument 90.3
Musician 90.3
Interior Design 87.9
Indoors 87.9
Person 87
Room 85.7
Crowd 84.6
Person 82.4
Person 78.1
Person 76.4
Person 72.1
Music Band 71.6
Person 71.5
Person 71
Person 67.7
Clothing 65.4
Apparel 65.4
Leisure Activities 65.1
Face 62.5
Theater 59.6
Officer 56.8
Military 56.8
Military Uniform 56.8
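The Amazon list above pairs each detected label with a confidence score. As a minimal sketch of how such a listing can be rendered from a Rekognition `detect_labels`-style response (the sample response below is illustrative, not the actual API output for this photograph):

```python
# Sketch: flatten a Rekognition detect_labels-style response into the
# "Label confidence" lines shown above. The sample dict is hypothetical.

def format_labels(response):
    """Return 'Name score' strings sorted by descending confidence."""
    labels = sorted(response.get("Labels", []),
                    key=lambda l: l["Confidence"], reverse=True)
    return [f'{l["Name"]} {round(l["Confidence"], 1)}' for l in labels]

sample = {"Labels": [
    {"Name": "Person", "Confidence": 98.9},
    {"Name": "Stage", "Confidence": 99.4},
    {"Name": "Crowd", "Confidence": 84.6},
]}
for line in format_labels(sample):
    print(line)  # e.g. "Stage 99.4"
```

In practice the response would come from a call such as `boto3.client("rekognition").detect_labels(...)`, which requires AWS credentials not shown here.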

Clarifai
created on 2023-10-26

people 99.7
wear 98.3
music 97.8
audience 97.3
adult 97.1
man 97.1
administration 96.9
group together 96
many 94.7
woman 94.5
musician 93
group 88.3
child 87
singer 83.5
uniform 83.1
adolescent 77
crowd 75.1
spectator 75
drum 74
drummer 74

Imagga
created on 2022-01-21

stage 90
platform 71.2
percussion instrument 31.9
musical instrument 31.3
people 29.5
person 25.9
marimba 25.2
man 24.2
business 22.5
group 21.7
male 21.3
men 18
businessman 17.6
silhouette 17.4
office 17
room 16.6
education 16.4
classroom 16.1
meeting 15.1
class 14.4
teacher 13.8
board 13.6
adult 13.3
blackboard 13
work 12.5
table 12.1
vibraphone 12
worker 11.6
corporate 11.2
student 11.1
school 11
crowd 10.5
music 10
team 9.8
interior 9.7
black 9.6
women 9.5
indoor 9.1
hand 9.1
modern 9.1
device 8.9
job 8.8
audience 8.8
executive 8.4
communication 8.4
manager 8.4
professional 8.3
confident 8.2
businesswoman 8.2
happy 8.1
sunset 8.1
speaker 8
center 8
to 8
lifestyle 7.9
conference 7.8
teaching 7.8
glass 7.8
concert 7.8
chair 7.7
performance 7.6
desk 7.5
learning 7.5
study 7.5
symbol 7.4
success 7.2
indoors 7
sky 7

Google
created on 2022-01-21

Microsoft
created on 2022-01-21

person 95.2
man 67.7
clothing 66.7
concert 65.7
text 61.2
old 47.7

Face analysis

AWS Rekognition

Age 23-31
Gender Male, 100%
Sad 93.1%
Confused 4.6%
Calm 1.4%
Angry 0.5%
Happy 0.2%
Disgusted 0.1%
Surprised 0.1%
Fear 0.1%

AWS Rekognition

Age 14-22
Gender Female, 66.6%
Sad 48.2%
Calm 40.6%
Happy 5.1%
Confused 3.2%
Disgusted 0.9%
Fear 0.9%
Angry 0.8%
Surprised 0.3%

AWS Rekognition

Age 26-36
Gender Male, 99.2%
Sad 98%
Confused 1.3%
Angry 0.2%
Calm 0.2%
Disgusted 0.2%
Fear 0.1%
Happy 0%
Surprised 0%

AWS Rekognition

Age 30-40
Gender Male, 97.4%
Calm 52.2%
Confused 25.6%
Sad 13.5%
Happy 3.9%
Angry 1.6%
Surprised 1.3%
Disgusted 1.3%
Fear 0.7%

AWS Rekognition

Age 28-38
Gender Male, 72.5%
Calm 99.3%
Sad 0.4%
Surprised 0.1%
Angry 0.1%
Confused 0.1%
Happy 0%
Disgusted 0%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Feature analysis

Amazon

Person 98.9%

Text analysis

Amazon

to