Human Generated Data

Title

Untitled (couple seated on tractor)

Date

c. 1950

People

Artist: Harry Annas, American 1897 - 1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2857

Machine Generated Data

Tags

Amazon
created on 2022-01-16

Person 99.7
Human 99.7
Person 99
Carriage 96.6
Transportation 96.6
Vehicle 96.6
Spoke 95.7
Machine 95.7
Wheel 94.7
Wagon 88.7
Boat 73.6
Female 63.6
Buggy 63.2
People 61.1
Tire 60.9
Sled 60.4

Clarifai
created on 2023-10-26

people 99.9
two 99
vehicle 98.9
group together 98.7
military 98.6
group 98.3
war 98.1
adult 97.6
gun 97.3
man 97.2
soldier 97
transportation system 96.6
weapon 95.9
army 95.9
three 95.4
cannon 95.2
child 94.8
four 93.1
wheel 91.7
cavalry 91.4

Imagga
created on 2022-01-16

wheelchair 72
chair 60.4
seat 44.4
furniture 25.2
brass 23.8
man 20.1
people 19.5
vehicle 19.2
wind instrument 18.1
work 18
beach 15.4
water 14.7
male 13.5
musical instrument 13.4
old 13.2
furnishing 13.1
wheeled vehicle 12.7
fun 12.7
transportation 12.5
travel 12
tricycle 11.9
outdoors 11.9
sea 11.7
sky 11.5
adult 11
person 10.8
holiday 10.7
outdoor 10.7
vacation 10.6
happy 10
wicker 9.7
summer 9.6
outside 9.4
lifestyle 9.4
carriage 9.3
tourism 9.1
black 9
technology 8.9
ocean 8.7
boy 8.7
love 8.7
wheel 8.6
play 8.6
men 8.6
winter 8.5
sport 8.4
help 8.4
leisure 8.3
transport 8.2
snow 7.9
smile 7.8
bike 7.8
sitting 7.7
human 7.5
landscape 7.4
season 7

Microsoft
created on 2022-01-16

outdoor 99.5
text 96.9
horse 93.8
person 89.3
black 88.2
posing 85.3
old 84.8
drawn 76.7
carriage 66.6
clothing 59.3
man 54.8
cart 47.3
team 32
vintage 30
bicycle 26.2

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 43-51
Gender Male, 98.9%
Calm 97.3%
Confused 0.7%
Happy 0.5%
Sad 0.5%
Angry 0.4%
Disgusted 0.4%
Surprised 0.2%
Fear 0.1%

AWS Rekognition

Age 30-40
Gender Male, 59.5%
Calm 54.6%
Surprised 17.4%
Happy 14.5%
Fear 5.8%
Confused 2.8%
Angry 1.8%
Sad 1.6%
Disgusted 1.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Wheel 94.7%
Boat 73.6%

Text analysis

Amazon

KODVK-SVEELA

Google

YT33A2 -YAGOX
YT33A2
-YAGOX