Human Generated Data

Title

Untitled (portrait of wedding party in church)

Date

c. 1950

People

Artist: Harry Annas, American, 1897–1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.3062

Machine Generated Data

Tags

Amazon
created on 2022-01-21

Person 99.3
Human 99.3
Person 99.2
Person 99.1
Person 99.1
Indoors 98.8
Person 96.9
Room 95.9
Person 94.8
Person 94.7
Person 92.9
People 86.1
Funeral 85.1
Suit 84.9
Clothing 84.9
Coat 84.9
Overcoat 84.9
Apparel 84.9
Suit 70.6
Aisle 62
Church 58.4
Building 58.4
Architecture 58.4
Hall 56.6
Court 55.2

Clarifai
created on 2023-10-26

people 99.9
group 99.2
woman 97.8
man 97.6
adult 97.4
many 94.2
leader 93.9
group together 93.9
indoors 93.5
administration 93.1
room 89.8
education 88.1
chair 88
furniture 87.6
audience 85.6
music 85.3
child 84.6
ceremony 83.6
school 82.4
monochrome 81.9

Imagga
created on 2022-01-21

marimba 67.5
percussion instrument 57.2
room 55.6
classroom 49.1
musical instrument 47.4
table 32.9
interior 30.1
chair 28.4
people 25.6
office 22
person 21.8
man 21.5
business 20.6
meeting 19.8
indoors 19.3
male 19.1
group 18.5
modern 18.2
women 18.2
businessman 17.6
restaurant 17.1
hall 17
adult 16.7
student 16.1
home 15.9
sitting 15.5
teacher 15.4
desk 15.1
corporate 14.6
indoor 14.6
men 14.6
happy 14.4
team 14.3
work 14.1
house 13.4
dining 13.3
professional 13.2
together 13.1
smiling 13
businesswoman 12.7
communication 12.6
executive 12.5
glass 12.4
education 12.1
food 12.1
decor 11.5
talking 11.4
design 11.2
teamwork 11.1
inside 11
lifestyle 10.8
wood 10.8
kitchen 10.7
job 10.6
life 10.6
cheerful 10.6
furniture 10.5
couple 10.4
businesspeople 10.4
manager 10.2
floor 10.2
wine 10.2
dinner 10.1
drink 10
suit 9.9
conference 9.8
colleagues 9.7
class 9.6
mature 9.3
window 9.2
success 8.8
blackboard 8.6
smile 8.5
outfit 8.5
learning 8.5
coffee 8.3
successful 8.2
school 8.2
stool 7.9
tables 7.9
coworkers 7.9
day 7.8
happiness 7.8
worker 7.8
employee 7.8
brass 7.8
portrait 7.8
seat 7.7
party 7.7
contemporary 7.5
presentation 7.4
style 7.4
service 7.4
meal 7.3
confident 7.3
board 7.2
love 7.1
speaker 7
counter 7

Google
created on 2022-01-21

Microsoft
created on 2022-01-21

funeral 86.9
text 79.7
candle 74.9
black and white 71
black 68
person 61.2

Color Analysis

Face analysis

AWS Rekognition

Age 38-46
Gender Male, 99.6%
Happy 55.1%
Calm 40.2%
Sad 2.2%
Confused 0.9%
Surprised 0.6%
Angry 0.3%
Disgusted 0.3%
Fear 0.3%

AWS Rekognition

Age 30-40
Gender Male, 99.5%
Calm 87.7%
Sad 8.4%
Confused 1.6%
Disgusted 0.8%
Angry 0.5%
Happy 0.5%
Fear 0.3%
Surprised 0.3%

AWS Rekognition

Age 40-48
Gender Male, 91.2%
Sad 69.1%
Calm 25.9%
Confused 2.6%
Happy 1.5%
Disgusted 0.3%
Surprised 0.2%
Angry 0.2%
Fear 0.2%

AWS Rekognition

Age 42-50
Gender Male, 64.3%
Calm 62.9%
Happy 30.7%
Sad 2.8%
Surprised 1.2%
Confused 1.1%
Disgusted 0.9%
Angry 0.3%
Fear 0.2%

AWS Rekognition

Age 43-51
Gender Female, 50.1%
Calm 42.9%
Sad 29.8%
Happy 23.5%
Confused 2.3%
Disgusted 0.5%
Surprised 0.4%
Angry 0.4%
Fear 0.2%

AWS Rekognition

Age 40-48
Gender Male, 99.8%
Sad 89.7%
Calm 7.1%
Happy 1.3%
Confused 1.1%
Disgusted 0.4%
Angry 0.2%
Surprised 0.2%
Fear 0.1%

AWS Rekognition

Age 37-45
Gender Male, 99.5%
Sad 98%
Confused 1%
Calm 0.4%
Happy 0.3%
Angry 0.2%
Disgusted 0.1%
Fear 0.1%
Surprised 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%
Suit 84.9%

Text analysis

Amazon

600
5
SEL.
239
38
2
SAFETY
BOOK
SEL

Google

381
381