Human Generated Data

Title

Untitled (men lined up on grass, spectators on sideline watching sporting event)

Date

c. 1950

People

Artist: Jack Rodden Studio, American, 1914 - 2016

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.13637

Machine Generated Data

Tags

Amazon
created on 2022-02-04

Person 99.7
Human 99.7
Person 99.7
Person 98.3
Person 98.2
Person 98
Person 97.9
People 94.7
Person 93.5
Person 89.5
Person 88.4
Sport 87.4
Sports 87.4
Person 86.7
Clothing 86.4
Apparel 86.4
Person 85.7
Person 85.5
Person 83.5
Person 82.8
Person 80.8
Team Sport 78.7
Team 78.7
Football 77.8
Person 74.7
Person 71.6
Field 71
Person 70
Person 65.7
Person 60.5
Shorts 57.4
Cricket 55.2

Clarifai
created on 2023-10-27

people 99.8
military 98.8
war 98
group together 97.8
many 97.8
soldier 97.2
group 97.1
cemetery 95.9
funeral 95.6
man 94.8
adult 93.6
uniform 92.1
ceremony 91.7
woman 89.9
wedding 89.5
leader 88.8
monochrome 88.8
child 87.2
chair 85.5
army 85.3

Imagga
created on 2022-02-04

picket fence 39.7
fence 34.2
landscape 27.5
barrier 24
sky 23.1
park bench 19.8
tree 19.8
bench 19.6
structure 19.2
negative 18.2
film 18
winter 17.9
trees 17.8
obstruction 16.1
forest 15.7
fog 15.4
field 15.1
rural 15
park 14.8
old 14.6
scene 13.9
outdoor 13.8
snow 13.3
cold 12.9
horizon 12.6
outdoors 12.1
grunge 11.9
seat 11.7
environment 11.5
dark 10.9
vintage 10.8
cemetery 10.3
season 10.1
stage 10.1
countryside 10.1
country 9.7
black 9.6
grass 9.5
cloud 9.5
clouds 9.3
retro 9
summer 9
foggy 8.9
mist 8.7
photographic paper 8.7
antique 8.7
land 8.6
space 8.5
platform 8.2
morning 8.1
dirty 8.1
scenery 8.1
building 8.1
sunset 8.1
meadow 8.1
scenic 7.9
destruction 7.8
color 7.8
empty 7.7
furniture 7.7
texture 7.6
pattern 7.5
silhouette 7.5
smoke 7.4
light 7.4
travel 7
autumn 7

Google
created on 2022-02-04

Microsoft
created on 2022-02-04

text 98.7
outdoor 95.7
grave 88.6
cemetery 88.2
tree 86.7
person 62.4
people 59.1
funeral 56.3
horse 13.1

Color Analysis

Face analysis

AWS Rekognition

Age 21-29
Gender Male, 65.8%
Calm 88.5%
Surprised 6.3%
Happy 1.8%
Sad 1.6%
Angry 0.8%
Disgusted 0.3%
Fear 0.3%
Confused 0.3%

AWS Rekognition

Age 38-46
Gender Male, 91.6%
Calm 96.7%
Sad 1%
Confused 0.5%
Happy 0.5%
Disgusted 0.5%
Angry 0.5%
Surprised 0.3%
Fear 0%

AWS Rekognition

Age 27-37
Gender Female, 61.8%
Happy 47.7%
Calm 45.6%
Sad 2.5%
Surprised 1.8%
Angry 0.8%
Disgusted 0.8%
Confused 0.6%
Fear 0.2%

AWS Rekognition

Age 47-53
Gender Male, 76.5%
Calm 93.7%
Happy 4.9%
Surprised 0.4%
Sad 0.4%
Fear 0.2%
Disgusted 0.2%
Confused 0.1%
Angry 0.1%

AWS Rekognition

Age 41-49
Gender Male, 97.4%
Calm 99.8%
Happy 0.1%
Sad 0.1%
Fear 0%
Angry 0%
Surprised 0%
Disgusted 0%
Confused 0%

AWS Rekognition

Age 45-53
Gender Female, 55.7%
Calm 98.2%
Sad 0.9%
Happy 0.4%
Disgusted 0.1%
Fear 0.1%
Confused 0.1%
Surprised 0.1%
Angry 0.1%

AWS Rekognition

Age 24-34
Gender Male, 96.2%
Calm 58.1%
Happy 15.5%
Sad 7.6%
Angry 5%
Confused 4.7%
Disgusted 3.5%
Surprised 3.2%
Fear 2.5%

AWS Rekognition

Age 28-38
Gender Male, 63.5%
Calm 77%
Happy 13.9%
Sad 6.7%
Angry 0.9%
Confused 0.5%
Surprised 0.4%
Disgusted 0.4%
Fear 0.3%

AWS Rekognition

Age 10-18
Gender Male, 69.5%
Calm 48.4%
Happy 44.7%
Sad 4.5%
Disgusted 1.3%
Angry 0.5%
Fear 0.2%
Confused 0.2%
Surprised 0.2%

AWS Rekognition

Age 35-43
Gender Male, 95.6%
Calm 96.4%
Happy 1.5%
Surprised 0.9%
Angry 0.6%
Sad 0.2%
Disgusted 0.2%
Confused 0.1%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 99.7%
Person 99.7%
Person 98.3%
Person 98.2%
Person 98%
Person 97.9%
Person 93.5%
Person 89.5%
Person 88.4%
Person 86.7%
Person 85.7%
Person 85.5%
Person 83.5%
Person 82.8%
Person 80.8%
Person 74.7%
Person 71.6%
Person 70%
Person 65.7%
Person 60.5%

Text analysis

Amazon

esp

Google

esA
esA