Human Generated Data

Title

Untitled (group of men seated and standing)

Date

1949

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18286

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Shorts 100
Clothing 100
Apparel 100
Person 99.3
Human 99.3
Person 99.1
Person 98.4
Person 97.8
Person 97.6
Person 95.9
Person 95.8
Person 95.5
Person 95.4
Person 95.3
Person 93.8
Chair 93.7
Furniture 93.7
Person 93
Person 92.8
Person 92.7
Indoors 83.2
People 82.1
Female 81.7
Tie 78.3
Accessories 78.3
Accessory 78.3
Room 74.4
Tie 74.1
Suit 72
Coat 72
Overcoat 72
Shoe 71.5
Footwear 71.5
Kid 63.7
Child 63.7
Dress 62.2
Floor 60.9
Girl 60.8
Skirt 60.3
Couch 60.3
Person 59.8
Woman 58.2
Classroom 56.8
School 56.8
Auditorium 55.2
Theater 55.2
Hall 55.2
Shoe 52.3
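The vendor tag lists above all share a simple "label score" line shape, where the score is a confidence percentage. A minimal sketch (the function name and threshold are illustrative, not any vendor's API) of parsing such lines and keeping only high-confidence tags:

```python
def parse_tags(lines, min_confidence=90.0):
    """Parse 'label score' lines into (label, score) pairs,
    keeping only tags at or above min_confidence."""
    tags = []
    for line in lines:
        # Split on the last space: the label may itself contain spaces
        # (e.g. "group together 98.7").
        label, _, score = line.rpartition(" ")
        if not label:
            continue  # no space at all: not a 'label score' line
        try:
            value = float(score)
        except ValueError:
            continue  # trailing token is not a number; skip the line
        tags.append((label, value))
    return [(label, value) for label, value in tags if value >= min_confidence]

sample = ["Shorts 100", "Person 99.3", "Tie 78.3"]
print(parse_tags(sample))  # [('Shorts', 100.0), ('Person', 99.3)]
```

The same parser applies unchanged to the Clarifai, Imagga, Google, and Microsoft lists below, since they use the identical line format with different score scales.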

Clarifai
created on 2023-10-22

people 100
group 99.4
many 99
group together 98.7
education 97.9
man 97.2
adult 96.9
woman 96.6
child 94.9
school 94.5
uniform 93.9
wear 91.1
outfit 90.1
leader 89.5
position 88.4
boy 82.6
portrait 81.8
several 81.8
adolescent 78
recreation 77.7

Imagga
created on 2022-03-04

athlete 60.4
runner 57
person 43.1
contestant 35.2
people 31.2
sport 29.8
male 29.8
man 27.5
silhouette 22.3
men 22.3
group 20.9
competition 19.2
active 17.9
classroom 17.8
adult 17.8
run 17.3
couple 16.5
fitness 16.2
planner 15.6
room 15.6
exercise 15.4
business 15.2
lifestyle 15.2
women 15
businessman 15
player 14.2
team 13.4
crowd 12.5
together 12.3
happy 11.9
running 11.5
black 11.4
success 11.3
action 11.1
training 11.1
ball 11
boy 10.4
teamwork 10.2
human 9.7
clothing 9.7
outdoors 9.7
gymnasium 9.6
motion 9.4
professional 9.1
portrait 9.1
fun 9
student 8.8
child 8.7
grass 8.7
track 8.6
race 8.6
trainer 8.5
friendship 8.4
field 8.4
hand 8.3
fit 8.3
speed 8.2
healthy 8.2
sunset 8.1
recreation 8.1
graphic 8
family 8
body 8
love 7.9
jogging 7.9
athletics 7.8
athletic facility 7.8
party 7.7
summer 7.7
world 7.7
move 7.7
youth 7.7
outdoor 7.6
casual 7.6
dark 7.5
leisure 7.5
football helmet 7.4
style 7.4
sports 7.4
water 7.3
teenager 7.3
girls 7.3
businesswoman 7.3
posing 7.1
work 7.1
modern 7

Google
created on 2022-03-04

Font 81.8
Suit 77.8
Event 71.2
Player 69
Team 68.6
Monochrome 65.4
Crew 62.4
Team sport 62.1
Room 61.6
History 61.4
Chair 61.3
Uniform 58.4
Sitting 58.1
Vintage clothing 57.1
Photo caption 54.6
Class 54.4
Monochrome photography 52.5

Microsoft
created on 2022-03-04

text 99.7
person 96.1
clothing 92.3
man 88.2
black and white 87
group 79
posing 71.1
line 19.6

Face analysis

AWS Rekognition

Age 45-51
Gender Male, 100%
Sad 68.9%
Calm 28.1%
Angry 0.7%
Fear 0.7%
Surprised 0.5%
Happy 0.4%
Confused 0.3%
Disgusted 0.3%

AWS Rekognition

Age 48-56
Gender Male, 100%
Surprised 34.2%
Calm 28.5%
Happy 9.9%
Disgusted 7.2%
Confused 6.1%
Angry 5.6%
Sad 4.5%
Fear 3.9%

AWS Rekognition

Age 49-57
Gender Male, 99.9%
Sad 93%
Calm 2.4%
Surprised 1.1%
Angry 1%
Fear 0.9%
Confused 0.7%
Disgusted 0.6%
Happy 0.3%

AWS Rekognition

Age 31-41
Gender Male, 99.9%
Sad 38.6%
Happy 27.9%
Calm 9.8%
Surprised 7.4%
Angry 6.3%
Confused 5.1%
Disgusted 3.1%
Fear 1.7%

AWS Rekognition

Age 49-57
Gender Male, 99.3%
Sad 66.8%
Happy 11.9%
Angry 9.8%
Calm 3.2%
Surprised 3.2%
Fear 1.9%
Confused 1.7%
Disgusted 1.4%

AWS Rekognition

Age 45-53
Gender Male, 100%
Sad 48.7%
Calm 38.4%
Happy 4.4%
Angry 4.4%
Disgusted 1.4%
Confused 1.3%
Surprised 0.9%
Fear 0.5%

AWS Rekognition

Age 35-43
Gender Male, 100%
Calm 73.2%
Sad 7%
Surprised 6.8%
Angry 5.5%
Confused 2.6%
Happy 1.8%
Disgusted 1.7%
Fear 1.5%

AWS Rekognition

Age 48-54
Gender Male, 100%
Calm 67.9%
Happy 11.2%
Sad 9.4%
Surprised 4.5%
Confused 3%
Angry 1.6%
Fear 1.3%
Disgusted 1%

AWS Rekognition

Age 24-34
Gender Male, 99.9%
Sad 87.5%
Calm 7.7%
Confused 1.7%
Surprised 1.1%
Angry 0.8%
Disgusted 0.6%
Fear 0.4%
Happy 0.2%

AWS Rekognition

Age 38-46
Gender Male, 100%
Calm 33%
Confused 27.3%
Sad 18.1%
Surprised 12.3%
Disgusted 3.9%
Angry 2%
Fear 1.7%
Happy 1.6%

AWS Rekognition

Age 41-49
Gender Male, 91.2%
Surprised 57.3%
Calm 22.4%
Happy 8%
Sad 7.9%
Disgusted 1.6%
Confused 1.6%
Angry 0.6%
Fear 0.6%

AWS Rekognition

Age 45-51
Gender Male, 99.6%
Calm 82.9%
Surprised 5.2%
Confused 4.6%
Angry 2.3%
Happy 1.9%
Sad 1.8%
Disgusted 0.9%
Fear 0.5%

AWS Rekognition

Age 45-51
Gender Male, 100%
Happy 38.9%
Sad 29.1%
Calm 11.6%
Surprised 9.3%
Angry 4.2%
Confused 3.8%
Disgusted 2%
Fear 1%

Google Vision

Faces detected: 15, with identical results for each:

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
Feature analysis

Amazon

Person 99.3%
Person 99.1%
Person 98.4%
Person 97.8%
Person 97.6%
Person 95.9%
Person 95.8%
Person 95.5%
Person 95.4%
Person 95.3%
Person 93.8%
Person 93%
Person 92.8%
Person 92.7%
Person 59.8%
Tie 78.3%
Tie 74.1%
Shoe 71.5%
Shoe 52.3%

Text analysis

Amazon

NATIONAL
FIRST
THE
FIRST BANK NATIONAL
IN
BANK
88
71g
KIZH
503on
KINGDO