Human Generated Data

Title

Untitled (group of men with hunted deer)

Date

c. 1950

People

Artist: Hamblin Studio, American, active 1930s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2050

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99.8
Human 99.8
Person 99.6
Person 99.5
Person 98.8
Person 98.6
Person 98.3
Person 98.3
Person 97.6
Person 95.9
Person 95.8
Person 95.6
Person 94.1
Mammal 79.1
Animal 79.1
Person 71.1
Hog 64.9
Pig 64.9
Person 63.3
Sunglasses 63.1
Accessories 63.1
Accessory 63.1
Person 58.9
Butcher Shop 56.7
Shop 56.7
Clothing 55.8
Apparel 55.8
People 55.5
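
The Amazon tags above are name/confidence pairs, many of them near-duplicates at different scores. A minimal sketch of filtering such pairs by a confidence threshold; the dictionary shape mirrors AWS Rekognition's documented `DetectLabels` response, but the sample values (copied from the list above), the threshold, and the `confident_labels` helper are illustrative, not part of this record:

```python
# A few label/confidence pairs copied from the Amazon tag list above,
# in the Name/Confidence shape that AWS Rekognition's DetectLabels returns.
labels = [
    {"Name": "Person", "Confidence": 99.8},
    {"Name": "Mammal", "Confidence": 79.1},
    {"Name": "Hog", "Confidence": 64.9},
    {"Name": "Sunglasses", "Confidence": 63.1},
    {"Name": "Butcher Shop", "Confidence": 56.7},
]

def confident_labels(labels, threshold=70.0):
    """Keep only the names of labels at or above the confidence threshold."""
    return [l["Name"] for l in labels if l["Confidence"] >= threshold]

print(confident_labels(labels))  # ['Person', 'Mammal']
```

Lowering the threshold admits the weaker guesses ("Hog", "Butcher Shop"), which is why low-confidence machine tags are usually displayed with their scores rather than as bare labels.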

Imagga
created on 2021-12-14

picket fence 40.3
fence 34.4
shovel 27.3
snow 26.1
barrier 25
landscape 21.6
hand tool 21.3
tool 20.4
tree 19.6
winter 19.6
drawing 19.3
grunge 16.2
obstruction 16.1
sky 15.3
old 14.6
structure 14.5
weather 14.4
silhouette 14.1
forest 13.9
sketch 13.5
rural 13.2
trees 12.4
outdoor 12.2
water 12
wood 11.7
black 11.4
cold 11.2
field 10.9
man 10.7
park 10.7
people 10.6
scene 10.4
beach 10.4
design 10.1
dirty 9.9
vintage 9.9
scenery 9.9
outdoors 9.8
plant 9.6
river 8.9
sport 8.6
frost 8.6
day 8.6
dark 8.3
sand 8.3
natural 8
light 8
mountain 8
male 7.8
season 7.8
representation 7.7
musical instrument 7.7
summer 7.7
tracing 7.5
frame 7.5
retro 7.4
backgrounds 7.3
art 7.3
sun 7.2
sunset 7.2
activity 7.2
grass 7.1
travel 7
architecture 7

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

outdoor 96.3
person 90.4
clothing 86.7
man 79.7
group 64.1
tree 62.8
white 60.5
old 59.6
posing 58.8
drawing 58.8
people 57.8

Face analysis

AWS Rekognition

Age 30-46
Gender Male, 83.2%
Calm 87.9%
Happy 4.7%
Angry 3.5%
Sad 1.8%
Confused 1.2%
Surprised 0.5%
Disgusted 0.3%
Fear 0.1%

AWS Rekognition

Age 22-34
Gender Male, 88.6%
Sad 74.7%
Calm 13.7%
Happy 6.4%
Angry 3.3%
Confused 1.6%
Surprised 0.1%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 36-52
Gender Male, 74%
Sad 64.7%
Calm 27%
Angry 2.3%
Happy 2.2%
Fear 2.1%
Confused 1.3%
Surprised 0.2%
Disgusted 0.1%

AWS Rekognition

Age 44-62
Gender Male, 69.5%
Calm 72.2%
Sad 26.2%
Happy 0.9%
Angry 0.4%
Confused 0.2%
Surprised 0.1%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 26-42
Gender Female, 66.2%
Calm 80.9%
Sad 9%
Happy 8.2%
Angry 1.2%
Fear 0.3%
Confused 0.2%
Surprised 0.1%
Disgusted 0%

AWS Rekognition

Age 21-33
Gender Male, 58.7%
Calm 59.6%
Sad 31.1%
Angry 3.8%
Confused 2.2%
Happy 2.1%
Surprised 0.4%
Disgusted 0.4%
Fear 0.3%

AWS Rekognition

Age 50-68
Gender Male, 64.2%
Calm 47.3%
Sad 46.9%
Happy 3.3%
Angry 1.6%
Confused 0.5%
Fear 0.1%
Surprised 0.1%
Disgusted 0.1%

AWS Rekognition

Age 22-34
Gender Female, 78.7%
Calm 54.4%
Sad 31.1%
Happy 9.3%
Angry 4.1%
Confused 0.7%
Surprised 0.1%
Fear 0.1%
Disgusted 0.1%

AWS Rekognition

Age 23-37
Gender Female, 71.8%
Calm 45.7%
Sad 43.3%
Happy 8.1%
Angry 1.1%
Fear 1%
Confused 0.4%
Surprised 0.2%
Disgusted 0.1%

AWS Rekognition

Age 26-40
Gender Male, 72.5%
Sad 42.3%
Angry 35%
Calm 17.9%
Fear 2%
Confused 1.6%
Happy 0.6%
Disgusted 0.4%
Surprised 0.2%

AWS Rekognition

Age 47-65
Gender Male, 90.9%
Calm 39.9%
Sad 38.2%
Happy 14.6%
Fear 2%
Angry 2%
Surprised 1.5%
Confused 1.5%
Disgusted 0.2%

AWS Rekognition

Age 37-55
Gender Female, 67%
Calm 43.6%
Sad 31.1%
Happy 14.9%
Fear 3%
Angry 2.5%
Disgusted 2.3%
Surprised 2.1%
Confused 0.6%

AWS Rekognition

Age 53-71
Gender Male, 72.5%
Sad 62.8%
Calm 34%
Confused 1.6%
Angry 0.5%
Surprised 0.5%
Happy 0.3%
Fear 0.2%
Disgusted 0.1%
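
Each AWS Rekognition face record above reports one confidence score per emotion, summing to roughly 100%. A sketch of reducing such a record to its dominant emotion; the scores are copied from the first face record above, while the `dominant_emotion` helper is a hypothetical convenience, not part of the Rekognition API:

```python
# Emotion scores (percentages) for one detected face, copied from the
# first AWS Rekognition face record above.
face = {
    "Calm": 87.9, "Happy": 4.7, "Angry": 3.5, "Sad": 1.8,
    "Confused": 1.2, "Surprised": 0.5, "Disgusted": 0.3, "Fear": 0.1,
}

def dominant_emotion(scores):
    """Return the (emotion, score) pair with the highest confidence."""
    return max(scores.items(), key=lambda kv: kv[1])

print(dominant_emotion(face))  # ('Calm', 87.9)
```

Note that several faces above split their mass between "Calm" and "Sad" (e.g. 47.3% vs. 46.9%), so the dominant label alone can hide an effectively ambiguous prediction.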

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
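
Unlike Rekognition's percentages, Google Vision reports face attributes as likelihood buckets. A sketch of mapping those buckets onto an ordinal scale so they can be compared or thresholded; the bucket names follow Vision's documented Likelihood enum, but the numeric scale and the `at_least` helper are assumptions for illustration:

```python
# Google Vision's Likelihood buckets, ordered least to most likely.
# The integer ranks are an illustrative ordinal scale, not API values.
LIKELIHOOD = {
    "Very unlikely": 0,
    "Unlikely": 1,
    "Possible": 2,
    "Likely": 3,
    "Very likely": 4,
}

def at_least(label, floor="Likely"):
    """True if the reported bucket is at or above the given floor."""
    return LIKELIHOOD[label] >= LIKELIHOOD[floor]

print(at_least("Unlikely"))       # False
print(at_least("Very likely"))    # True
```

Under this scale, every attribute in the records above falls below "Likely"; only two faces reach even "Unlikely" (for Headwear).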

Feature analysis

Amazon

Person 99.8%
Sunglasses 63.1%

Captions

Microsoft

a vintage photo of a group of people posing for a picture 90%
a vintage photo of a group of people posing for the camera 88.2%
a group of people posing for a photo 88.1%

Text analysis

Amazon

KODOK-&VEELA