Human Generated Data

Title

Untitled (boys and girls hulahooping)

Date

1958, printed later

People

Artist: Francis J. Sullivan, American 1916 - 1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.151

Machine Generated Data

Tags (label followed by confidence score, 0-100)

Amazon
created on 2022-01-08

Person 99.7
Human 99.7
Person 99.7
Person 99.6
Person 99.6
Person 99.5
Person 99.3
Person 99.2
Person 98.9
Person 98.8
Dog 98
Mammal 98
Animal 98
Canine 98
Pet 98
Person 97.5
Person 97
Car 90.1
Transportation 90.1
Vehicle 90.1
Automobile 90.1
Toy 82.1
Leisure Activities 72.9
Crowd 72.1
Person 66.5
Hula 55.4
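
The list above has the shape of an AWS Rekognition DetectLabels response: one label name per line followed by a confidence score in percent. A minimal sketch of how such tags could be regenerated with boto3 follows; the file name and the confidence cutoff are assumptions for illustration, not details recorded here.

    # Sketch only: regenerating Rekognition-style labels with boto3.
    # The file name and the 55% cutoff are assumed, not taken from this record.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("untitled_hulahooping.jpg", "rb") as f:  # hypothetical local copy
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=55.0,  # lowest label shown above ("Hula") is 55.4
    )

    for label in response["Labels"]:
        # Prints lines in the same "Name score" shape as the list above
        print(f'{label["Name"]} {label["Confidence"]:.1f}')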

Clarifai
created on 2023-10-25

people 99.9
street 98.3
monochrome 97.6
group together 97.4
man 96.6
child 94.1
adult 93
group 92.2
woman 90.8
sport 90.4
many 90.2
spectator 87.9
crowd 87.7
uniform 86.6
boy 86.1
art 86
girl 83.9
music 83.6
school 83.3
athlete 82.5

Imagga
created on 2022-01-08

sword 55.6
pedestrian 53.8
weapon 52.6
man 24.2
park 21.4
maypole 21.2
people 20.6
outdoors 17.9
sport 17.2
male 17
snow 16.6
post 16.3
winter 16.2
outdoor 16.1
city 15.8
tree 15.8
couple 15.7
street 14.7
cold 13.8
upright 13.6
fun 13.5
day 13.3
boy 13
person 12.8
travel 12.7
grass 12.7
tourist 11.9
walk 11.4
walking 11.4
bench 11.1
adult 11
summer 10.9
season 10.9
exercise 10.9
recreation 10.8
forest 10.4
athlete 10.4
sky 10.2
happy 10
leisure 10
trees 9.8
landscape 9.7
women 9.5
love 9.5
men 9.4
outside 9.4
family 8.9
together 8.8
urban 8.7
path 8.5
structural member 8.5
fall 8.1
fitness 8.1
road 8.1
active 8.1
lifestyle 8
happiness 7.8
child 7.8
two 7.6
tourism 7.4
girls 7.3
group 7.3
activity 7.2
river 7.1
spring 7.1
autumn 7

Google
created on 2022-01-08

Tree 88.4
Black-and-white 85.5
Style 84
Musical instrument 77.2
Sky 75.7
Monochrome photography 75.2
Monochrome 75.1
Event 69.9
Plant 69.3
Building 66.8
Art 65.2
Stock photography 63.1
Recreation 57.6
Pedestrian 57.4
Musician 55.9
Hat 55.3
History 54.6
Team 53.4
Team sport 52.2
Street 51.2
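
Google's tag list corresponds to the label annotations returned by the Cloud Vision API, where each label carries a score between 0 and 1 (shown above as a percentage). A hedged sketch with the google-cloud-vision client library; the file name is an assumption.

    # Sketch: label detection with the google-cloud-vision client library.
    # The file name is hypothetical; scores are scaled to percent to match the list above.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("untitled_hulahooping.jpg", "rb") as f:  # hypothetical local copy
        content = f.read()

    response = client.label_detection(image=vision.Image(content=content))

    for label in response.label_annotations:
        # e.g. "Tree 88.4"
        print(f"{label.description} {label.score * 100:.1f}")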

Microsoft
created on 2022-01-08

outdoor 97.8
person 95.8
footwear 95.2
black and white 94.1
tree 91.9
text 86.5
clothing 86
man 82.8
dance 74.3
street 70.1
people 68.6
monochrome 68
sport 67.5
playground 61.8
group 58
several 11.3

Face analysis

AWS Rekognition

Age 6-14
Gender Female, 72.5%
Happy 97.9%
Angry 0.6%
Calm 0.4%
Confused 0.3%
Sad 0.3%
Surprised 0.3%
Fear 0.2%
Disgusted 0.2%

AWS Rekognition

Age 6-14
Gender Male, 99.9%
Sad 47%
Calm 29.3%
Fear 18.5%
Surprised 2.3%
Confused 0.8%
Angry 0.8%
Happy 0.7%
Disgusted 0.5%

AWS Rekognition

Age 19-27
Gender Female, 93.7%
Sad 92.8%
Calm 2.7%
Confused 2.7%
Angry 0.8%
Disgusted 0.5%
Surprised 0.2%
Fear 0.2%
Happy 0.1%

AWS Rekognition

Age 4-12
Gender Male, 98.3%
Sad 49.7%
Calm 39.3%
Confused 8.5%
Angry 0.9%
Disgusted 0.6%
Surprised 0.4%
Happy 0.4%
Fear 0.3%

AWS Rekognition

Age 20-28
Gender Male, 99.8%
Angry 97.9%
Calm 1.1%
Confused 0.5%
Happy 0.1%
Sad 0.1%
Surprised 0.1%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 18-26
Gender Male, 78.7%
Sad 95.6%
Calm 2.7%
Disgusted 0.3%
Confused 0.3%
Fear 0.3%
Angry 0.3%
Surprised 0.3%
Happy 0.1%

AWS Rekognition

Age 21-29
Gender Male, 95.1%
Calm 95.6%
Happy 1.5%
Sad 1.1%
Confused 0.7%
Disgusted 0.5%
Surprised 0.2%
Angry 0.2%
Fear 0.1%

AWS Rekognition

Age 2-8
Gender Female, 85.2%
Calm 46.2%
Happy 25.1%
Sad 7.6%
Fear 5.8%
Surprised 5.6%
Confused 5.4%
Disgusted 3.4%
Angry 0.8%

AWS Rekognition

Age 16-24
Gender Female, 89.2%
Happy 23.9%
Calm 23.3%
Fear 19.3%
Sad 15.1%
Disgusted 6.2%
Surprised 5.6%
Confused 5.5%
Angry 1.1%

AWS Rekognition

Age 21-29
Gender Female, 97.4%
Fear 92.4%
Sad 4.4%
Calm 1.3%
Angry 0.8%
Confused 0.4%
Surprised 0.3%
Disgusted 0.2%
Happy 0.2%
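
Each AWS Rekognition block above matches one entry of a DetectFaces response with all attributes requested: an estimated age range, a gender prediction with confidence, and emotion scores. A minimal boto3 sketch, with the file name again assumed:

    # Sketch of the face-attribute call behind the blocks above (file name assumed).
    import boto3

    rekognition = boto3.client("rekognition")

    with open("untitled_hulahooping.jpg", "rb") as f:  # hypothetical local copy
        image_bytes = f.read()

    faces = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # required to get age range, gender, and emotions
    )

    for face in faces["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        # Sort emotions by confidence to match the descending order shown above
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')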

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
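
The Google Vision blocks report categorical Likelihood values per face rather than numeric scores. A sketch using the same google-cloud-vision client as above (file name again assumed):

    # Sketch: per-face likelihood buckets as listed above (file name assumed).
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("untitled_hulahooping.jpg", "rb") as f:  # hypothetical local copy
        content = f.read()

    response = client.face_detection(image=vision.Image(content=content))

    # Index positions follow the Vision API Likelihood enum
    likelihood = ("Unknown", "Very unlikely", "Unlikely",
                  "Possible", "Likely", "Very likely")

    for face in response.face_annotations:
        print("Surprise", likelihood[face.surprise_likelihood])
        print("Anger", likelihood[face.anger_likelihood])
        print("Sorrow", likelihood[face.sorrow_likelihood])
        print("Joy", likelihood[face.joy_likelihood])
        print("Headwear", likelihood[face.headwear_likelihood])
        print("Blurred", likelihood[face.blurred_likelihood])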

Feature analysis

Amazon

Person 99.7%
Dog 98%
Car 90.1%
