Human Generated Data

Title

Untitled (group of people, mostly children, using hula hoops)

Date

1958

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17974

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Person 99.3
Human 99.3
Person 99.1
Person 98.8
Person 98.7
Person 98.6
Person 97.8
Person 97.2
Person 97
Person 95
Tarmac 94.2
Asphalt 94.2
Person 93.8
People 88.8
Sport 86.9
Sports 86.9
Road 85.4
Field 85.1
Person 83.4
Shorts 75.5
Clothing 75.5
Apparel 75.5
Person 71.6
Person 69.9
Team Sport 65.6
Team 65.6

Clarifai
created on 2023-10-29

people 99.8
group together 99.3
many 98.5
uniform 97.1
military 96
street 95
soldier 94.1
group 93.8
man 92.2
war 87.7
police 87.5
crowd 86.9
adult 85.1
weapon 83.2
wear 81.1
vehicle 80.4
woman 80.2
gun 80
spectator 79.5
road 77.3

Imagga
created on 2022-03-04

runner 100
athlete 100
contestant 91.6
person 45.9
sport 33.5
road 20.8
sky 17.8
exercise 17.2
competition 16.5
recreation 16.1
outdoor 16
ball 15.7
travel 15.5
man 15.4
tennis 14.6
court 14.6
outdoors 13.6
fitness 13.5
day 13.3
net 12.3
summer 12.2
active 12.1
play 12.1
sports 12
field 11.7
line 11.3
action 11.1
leisure 10.8
transportation 10.8
game 10.7
trees 10.7
people 10.6
landscape 10.4
street 10.1
speed 10.1
light 10
horizon 9.9
highway 9.6
cloud 9.5
outside 9.4
lifestyle 9.4
tree 9.2
playing 9.1
sunset 9
fun 9
asphalt 8.8
urban 8.7
grass 8.7
male 8.5
car 8.5
vacation 8.2
scene 7.8
color 7.8
empty 7.7
run 7.7
running 7.7
direction 7.6
clouds 7.6
traffic 7.6
dark 7.5
park 7.4
transport 7.3
activity 7.2
mountain 7.1
sea 7

Google
created on 2022-03-04

Microsoft
created on 2022-03-04

tree 98.6
outdoor 98.4
text 94.2
black and white 88.8
person 67.6
playground 53.8
several 16.2

Face analysis

Amazon

AWS Rekognition

Age 19-27
Gender Female, 90.4%
Calm 76.8%
Sad 18.6%
Angry 1.1%
Disgusted 1%
Confused 0.7%
Happy 0.7%
Fear 0.6%
Surprised 0.5%

AWS Rekognition

Age 25-35
Gender Female, 87.3%
Sad 67.1%
Calm 31.2%
Confused 0.6%
Happy 0.3%
Angry 0.3%
Disgusted 0.2%
Surprised 0.1%
Fear 0.1%

AWS Rekognition

Age 19-27
Gender Male, 70.9%
Happy 67.3%
Sad 8.8%
Calm 6.3%
Confused 4.8%
Surprised 4.6%
Disgusted 3.6%
Fear 2.5%
Angry 2.1%

AWS Rekognition

Age 20-28
Gender Male, 94.5%
Calm 85.4%
Happy 8.2%
Fear 4%
Disgusted 1%
Angry 0.5%
Sad 0.5%
Surprised 0.2%
Confused 0.2%

AWS Rekognition

Age 16-22
Gender Male, 88.7%
Calm 63.8%
Fear 21.1%
Disgusted 3.1%
Sad 2.9%
Surprised 2.6%
Angry 2.6%
Confused 2.2%
Happy 1.8%

AWS Rekognition

Age 12-20
Gender Male, 91.9%
Calm 91.6%
Angry 2.9%
Fear 2.1%
Sad 1.5%
Surprised 0.8%
Happy 0.5%
Confused 0.3%
Disgusted 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 99.3%
Person 99.1%
Person 98.8%
Person 98.7%
Person 98.6%
Person 97.8%
Person 97.2%
Person 97%
Person 95%
Person 93.8%
Person 83.4%
Person 71.6%
Person 69.9%