Human Generated Data

Title

Untitled (children seated at table, Methodist Church)

Date

c. 1950

People

Artist: Harry Annas, American, 1897–1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2764

Machine Generated Data

Tags

Amazon
created on 2022-01-16

Person 99.6
Human 99.6
Person 98.5
Person 98.4
Person 97.4
Person 96.4
Person 96.2
Person 95.6
Apparel 95.1
Clothing 95.1
Person 93.5
Person 92.5
Person 90
Chair 88.3
Furniture 88.3
Room 86.7
Indoors 86.7
People 86.5
Person 78.7
Person 78.4
Female 75.6
Person 73.1
Table 72.2
Kindergarten 72.1
Person 71.5
Kid 70.8
Child 70.8
Dress 68.9
Face 68.8
Girl 68.2
Living Room 63.2
Baby 62.7
Dining Table 62.4
Sitting 60.8
Suit 58.3
Coat 58.3
Overcoat 58.3
Person 58.1
Person 58
Person 56.8

Clarifai
created on 2023-10-26

people 99.9
group 99.7
child 99.1
education 98.3
monochrome 97.4
teacher 96.8
group together 96.2
woman 96.1
adult 96.1
elementary school 94.5
school 94.3
man 93.6
classroom 93.5
indoors 91.9
many 91.9
room 90.4
administration 88.8
several 87.9
family 87.7
adolescent 87.1

Imagga
created on 2022-01-16

man 31.6
male 30.5
person 30
people 29
brass 25.9
wind instrument 23.5
businessman 22.9
adult 22.9
musical instrument 22.4
business 20.6
room 19.9
teacher 19.4
men 18.9
classroom 17
professional 15.8
couple 15.7
job 15
women 15
group 14.5
meeting 14.1
table 13.8
office 13.6
home 13.5
blackboard 13.2
life 13.1
businesswoman 12.7
worker 12.7
happy 12.5
smiling 12.3
senior 12.2
family 11.6
executive 11.5
interior 11.5
education 11.3
corporate 11.2
sitting 11.2
work 11.1
happiness 11
indoor 10.9
team 10.7
indoors 10.5
new 10.5
modern 10.5
portrait 10.3
teamwork 10.2
two 10.2
school 10.2
lifestyle 10.1
student 9.9
holding 9.9
old 9.7
businesspeople 9.5
smile 9.3
communication 9.2
cheerful 8.9
educator 8.9
to 8.8
conference 8.8
kin 8.8
partner 8.7
boy 8.7
device 8.6
chair 8.6
youth 8.5
study 8.4
manager 8.4
success 8
child 8
cornet 7.9
paper 7.8
color 7.8
mother 7.8
class 7.7
desk 7.6
fashion 7.5
learning 7.5
fun 7.5
vintage 7.4
mature 7.4
occupation 7.3
successful 7.3
girls 7.3
black 7.2
suit 7.2
kitchen 7.1
handsome 7.1
idea 7.1
working 7.1
together 7

Google
created on 2022-01-16

Microsoft
created on 2022-01-16

person 97.8
clothing 96.2
text 95.4
man 73.9
woman 72.3
drawing 69
cartoon 56

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 26-36
Gender Male, 99.6%
Happy 90.7%
Disgusted 2.6%
Surprised 2.2%
Calm 1.9%
Fear 0.9%
Sad 0.6%
Angry 0.6%
Confused 0.5%

AWS Rekognition

Age 16-24
Gender Male, 83.4%
Calm 97.1%
Angry 0.9%
Sad 0.9%
Confused 0.4%
Happy 0.3%
Surprised 0.1%
Fear 0.1%
Disgusted 0.1%

AWS Rekognition

Age 9-17
Gender Female, 57.2%
Calm 98.4%
Sad 0.6%
Happy 0.3%
Confused 0.2%
Angry 0.2%
Surprised 0.2%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 37-45
Gender Male, 90.7%
Calm 96.6%
Sad 1.9%
Confused 0.5%
Happy 0.3%
Disgusted 0.2%
Surprised 0.2%
Angry 0.1%
Fear 0.1%

AWS Rekognition

Age 11-19
Gender Male, 97.9%
Calm 90.5%
Angry 2.3%
Sad 1.9%
Surprised 1.9%
Fear 1.3%
Confused 0.9%
Disgusted 0.9%
Happy 0.3%

AWS Rekognition

Age 30-40
Gender Female, 98.7%
Happy 68.5%
Calm 27.3%
Fear 1%
Surprised 0.9%
Sad 0.7%
Angry 0.7%
Disgusted 0.6%
Confused 0.4%

AWS Rekognition

Age 27-37
Gender Male, 60.5%
Calm 81.6%
Happy 5.4%
Sad 4.4%
Surprised 4%
Fear 1.9%
Angry 1.2%
Disgusted 1.1%
Confused 0.3%

AWS Rekognition

Age 39-47
Gender Male, 99.8%
Calm 77%
Fear 5.9%
Sad 5.9%
Surprised 3.5%
Happy 3.2%
Angry 1.9%
Disgusted 1.5%
Confused 1.1%

AWS Rekognition

Age 23-31
Gender Male, 98.5%
Calm 99.8%
Surprised 0.1%
Angry 0%
Confused 0%
Sad 0%
Happy 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 21-29
Gender Female, 92.3%
Calm 58.2%
Sad 25.7%
Surprised 5.2%
Happy 5.1%
Angry 2.1%
Fear 1.8%
Confused 0.9%
Disgusted 0.9%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%

Text analysis

Amazon

9
KODAK
SUFETY