Human Generated Data

Title

Untitled (group of members at Grange event, Derry, NH)

Date

January 1954

People

Artist: Francis J. Sullivan, American 1916 - 1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18014

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Person 99.2
Human 99.2
Person 99.1
Person 99
Person 98.9
Person 98.8
Musician 98.8
Musical Instrument 98.8
Person 98.5
Music Band 96.1
Person 95.8
Person 94.1
Crowd 92.6
Person 91
Person 90.9
Stage 90.8
Person 90.8
Person 90.2
Leisure Activities 84.6
Person 84.3
Room 80.9
Indoors 80.9
Suit 75.3
Coat 75.3
Clothing 75.3
Overcoat 75.3
Apparel 75.3
Person 73
Person 72.7
Person 67.8
Concert 63.6
Shoe 62.7
Footwear 62.7
Theater 62.2
Shoe 50.3
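
The Amazon tags above pair a label with a 0-100 confidence score, which is the shape of output returned by the AWS Rekognition DetectLabels API. Below is a minimal sketch of how comparable tags could be generated with boto3; the image filename, region, and thresholds are placeholder assumptions, not details of the museum's actual pipeline.

import boto3

# Placeholder assumptions: AWS credentials configured locally and a local
# JPEG scan of the photograph. Neither comes from this record.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("sullivan_grange_event.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=50,
)

# Print label/score pairs in the same "Label confidence" form used above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")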

Clarifai
created on 2023-10-29

people 99.9
group together 99.1
many 97.3
adult 97.1
man 96.4
group 96.3
woman 93.7
wear 91.5
recreation 88.6
sports equipment 86.8
education 85.2
music 84.2
competition 84
athlete 80.9
outfit 80.9
school 79.7
child 79.5
dancing 78.1
uniform 75.9
several 72.9
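
The Clarifai tags follow the same name-plus-confidence pattern. Below is a sketch of one way to request comparable concepts through Clarifai's v2 REST API with its general image-recognition model; the access token, model path, and image file are placeholder assumptions, and the request shape is an approximation rather than the call actually used for this record.

import base64
import requests

# Placeholder assumptions: a Clarifai personal access token and a local scan.
PAT = "YOUR_CLARIFAI_PAT"
MODEL_URL = (
    "https://api.clarifai.com/v2/users/clarifai/apps/main/"
    "models/general-image-recognition/outputs"
)

with open("sullivan_grange_event.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

resp = requests.post(
    MODEL_URL,
    headers={"Authorization": f"Key {PAT}", "Content-Type": "application/json"},
    json={"inputs": [{"data": {"image": {"base64": image_b64}}}]},
    timeout=30,
)
resp.raise_for_status()

# Concepts come back with a 0-1 value; scale to match the 0-100 scores above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")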

Imagga
created on 2022-03-04

player 40.5
golfer 39.8
person 36.9
man 33.6
people 30.7
contestant 29.2
male 29.1
silhouette 27.3
businessman 21.2
adult 19.1
business 18.8
men 17.2
group 16.9
professional 16.8
sport 15.4
brass 15.1
musical instrument 14.7
athlete 14
blackboard 14
black 13.9
lifestyle 13.7
success 13.7
wind instrument 13.6
dancer 13.5
job 13.3
hand 12.9
dance 12.7
team 12.5
gymnasium 12.3
couple 12.2
ball 12
suit 11.7
sunset 11.7
active 11.3
boy 11.3
human 11.2
competition 11
exercise 10.9
fitness 10.8
looking 10.4
stage 10.4
athletic facility 10
performer 9.8
portrait 9.7
together 9.6
women 9.5
work 9.4
facility 9.4
happy 9.4
dark 9.2
pretty 9.1
fun 9
one 9
style 8.9
crowd 8.6
employee 8.6
party 8.6
youth 8.5
beach 8.4
basketball 8.4
modern 8.4
attractive 8.4
fashion 8.3
sign 8.3
outdoors 8.2
teenager 8.2
recreation 8.1
handsome 8
worker 8
body 8
sea 7.8
dancing 7.7
two 7.6
career 7.6
sports equipment 7.6
communication 7.6
guy 7.5
friends 7.5
friendship 7.5
holding 7.4
symbol 7.4
clothing 7.4
stringed instrument 7.3
water 7.3
design 7.3
guitar 7.1
love 7.1
sky 7
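
The Imagga tags could likewise be reproduced in outline with Imagga's /v2/tags endpoint, which already reports confidences on a 0-100 scale. The API credentials and image URL below are placeholder assumptions.

import requests

# Placeholder assumptions: Imagga API credentials and a publicly reachable scan.
IMAGGA_KEY = "YOUR_API_KEY"
IMAGGA_SECRET = "YOUR_API_SECRET"
IMAGE_URL = "https://example.org/sullivan_grange_event.jpg"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
    timeout=30,
)
resp.raise_for_status()

# Each entry carries a confidence score and an English tag name.
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")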

Google
created on 2022-03-04

Microsoft
created on 2022-03-04

musical instrument 99.5
person 99.4
text 96.5
posing 86.1
standing 82.5
guitar 81.6
concert 79.6
player 77.8
black 77.5
accordion 77.2
black and white 74.9
white 74.9
drum 73.6
group 64
saxophone 61.6
folk instrument 60
old 59
people 58
female 27.1
line 22.3
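
The Microsoft tags resemble output from Azure's Computer Vision image-tagging service. Below is a sketch against the v3.2 Tag REST endpoint; the resource endpoint, key, and image file are placeholder assumptions.

import requests

# Placeholder assumptions: an Azure Computer Vision resource endpoint and key.
ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"
KEY = "YOUR_AZURE_KEY"

with open("sullivan_grange_event.jpg", "rb") as f:
    image_bytes = f.read()

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/tag",
    headers={
        "Ocp-Apim-Subscription-Key": KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
    timeout=30,
)
resp.raise_for_status()

# Confidences are returned on a 0-1 scale; scale to match the listing above.
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")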

Color Analysis

Face analysis

AWS Rekognition

Age 52-60
Gender Male, 99.9%
Sad 97.9%
Disgusted 0.6%
Happy 0.5%
Confused 0.4%
Calm 0.2%
Angry 0.2%
Fear 0.1%
Surprised 0.1%

AWS Rekognition

Age 43-51
Gender Male, 99.9%
Calm 97.9%
Happy 0.7%
Surprised 0.5%
Disgusted 0.4%
Sad 0.2%
Angry 0.1%
Fear 0.1%
Confused 0.1%

AWS Rekognition

Age 37-45
Gender Male, 99.2%
Calm 75.9%
Happy 18%
Confused 2.1%
Surprised 1.4%
Disgusted 1.4%
Sad 0.6%
Fear 0.5%
Angry 0.2%

AWS Rekognition

Age 36-44
Gender Male, 95.7%
Calm 66.6%
Happy 22.5%
Confused 3.4%
Surprised 3.3%
Sad 1.8%
Disgusted 1.6%
Fear 0.4%
Angry 0.4%

AWS Rekognition

Age 49-57
Gender Male, 99.9%
Sad 42.7%
Confused 22%
Happy 15.2%
Calm 14.1%
Surprised 1.8%
Disgusted 1.7%
Angry 1.6%
Fear 0.9%

AWS Rekognition

Age 39-47
Gender Male, 100%
Calm 96.3%
Sad 3.1%
Confused 0.2%
Disgusted 0.1%
Happy 0.1%
Angry 0.1%
Surprised 0.1%
Fear 0%

AWS Rekognition

Age 38-46
Gender Male, 99.6%
Calm 93.5%
Sad 4.2%
Happy 0.8%
Disgusted 0.5%
Confused 0.4%
Surprised 0.3%
Angry 0.2%
Fear 0.1%

AWS Rekognition

Age 45-51
Gender Male, 99.8%
Calm 77.6%
Happy 20.2%
Surprised 1%
Confused 0.4%
Disgusted 0.4%
Angry 0.1%
Sad 0.1%
Fear 0.1%

AWS Rekognition

Age 39-47
Gender Male, 99.8%
Calm 44.5%
Sad 25.7%
Surprised 17.3%
Confused 4.8%
Happy 4.7%
Disgusted 1.4%
Angry 1.3%
Fear 0.4%

AWS Rekognition

Age 33-41
Gender Male, 99.5%
Calm 98.7%
Sad 0.7%
Happy 0.4%
Confused 0.1%
Disgusted 0.1%
Surprised 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 42-50
Gender Male, 99.8%
Calm 99.8%
Surprised 0.1%
Disgusted 0%
Confused 0%
Angry 0%
Sad 0%
Happy 0%
Fear 0%

AWS Rekognition

Age 43-51
Gender Male, 99.6%
Calm 85.8%
Sad 11.3%
Confused 1%
Happy 0.7%
Disgusted 0.3%
Fear 0.3%
Angry 0.3%
Surprised 0.3%

AWS Rekognition

Age 43-51
Gender Male, 99.9%
Calm 98%
Sad 0.8%
Confused 0.6%
Surprised 0.4%
Angry 0.1%
Happy 0.1%
Disgusted 0.1%
Fear 0%

AWS Rekognition

Age 48-54
Gender Male, 99.2%
Calm 55.6%
Sad 16%
Disgusted 8%
Confused 7.6%
Surprised 5.7%
Happy 3.8%
Angry 1.7%
Fear 1.5%

AWS Rekognition

Age 34-42
Gender Male, 54.3%
Calm 74%
Surprised 9.5%
Sad 6.2%
Happy 5%
Confused 3.5%
Angry 0.7%
Disgusted 0.7%
Fear 0.3%
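
The per-face age ranges, gender calls, and emotion percentages above match the shape of AWS Rekognition's DetectFaces output when all facial attributes are requested. A minimal sketch with boto3 follows; the image file and region are placeholder assumptions.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("sullivan_grange_event.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] is needed to get age range, gender, and emotions.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions are sorted here to mirror the descending order shown above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
    print()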

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
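
The Google Vision rows report per-face likelihood buckets (Very unlikely through Very likely) rather than percentages, which is how the Vision API's face detection expresses its estimates. Below is a sketch with the google-cloud-vision client library (assuming a recent 2.x-style release); credentials and the image file are placeholder assumptions.

from google.cloud import vision

# Assumes GOOGLE_APPLICATION_CREDENTIALS points at a service-account key.
client = vision.ImageAnnotatorClient()

with open("sullivan_grange_event.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihood enum names correspond to the buckets above (VERY_UNLIKELY, ...).
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
    print()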

Feature analysis

Amazon

Person 99.2%
Person 99.1%
Person 99%
Person 98.9%
Person 98.8%
Person 98.5%
Person 95.8%
Person 94.1%
Person 91%
Person 90.9%
Person 90.8%
Person 90.2%
Person 84.3%
Person 73%
Person 72.7%
Person 67.8%
Shoe 62.7%
Shoe 50.3%
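
The Person and Shoe percentages in this feature list correspond to the per-instance detections that Rekognition's DetectLabels attaches to certain labels, each with its own confidence and bounding box. A short sketch of pulling them out of that kind of response follows; the image file and region remain placeholder assumptions.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("sullivan_grange_event.jpg", "rb") as f:
    response = rekognition.detect_labels(Image={"Bytes": f.read()}, MinConfidence=50)

# Only some labels (e.g. Person, Shoe) carry per-instance bounding boxes.
for label in response["Labels"]:
    for instance in label.get("Instances", []):
        box = instance["BoundingBox"]
        print(f"{label['Name']} {instance['Confidence']:.1f}% "
              f"box=({box['Left']:.2f}, {box['Top']:.2f}, "
              f"{box['Width']:.2f}, {box['Height']:.2f})")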

Text analysis

Amazon

J20
AS
LAS
F
S
7
GY
KODAA
EC
З

Google

YT37A°2- XAGO
YT37A°2-
XAGO
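
The detected strings above are fragments of text picked up from the print itself, in the form returned by OCR-style text detection. Below is a minimal sketch of the Amazon side with Rekognition's DetectText via boto3; the image file and region are placeholder assumptions. The Google rows could analogously come from the Vision API's text_detection method, which returns text_annotations.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("sullivan_grange_event.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# LINE detections roughly correspond to the strings listed above;
# WORD detections break them into smaller pieces.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])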