Human Generated Data

Title

Untitled (three men posed sitting on or beside tractor in field)

Date

c. 1930-1945

People

Artist: O. B. Porter Studio, American, active 1930s-1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10927

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Wheel 99.9
Machine 99.9
Person 99.7
Human 99.7
Person 99.4
Person 88.7
Engine 80.8
Motor 80.8
Tire 70.9
Spoke 70.5
Car Wheel 56.2

Clarifai
created on 2023-10-29

people 99.9
vehicle 98.3
adult 98.1
man 96.9
group 96.4
transportation system 95.8
group together 94.5
machine 92
woman 91.4
child 84.5
grinder 83.9
tractor 83.7
war 83.5
wheel 82.1
two 80.6
raw material 80.4
many 80
industry 79.2
three 76.7
actor 76.1

Imagga
created on 2022-02-05

silhouette 16.6
light 15.4
old 14.6
dark 14.2
man 14.1
sky 13.4
vehicle 12.8
sunset 12.6
fire 12.2
black 12
night 11.5
machine 11.5
wheeled vehicle 11.3
device 10.9
musical instrument 10.6
construction 10.3
work 10.2
industrial 10
person 9.8
people 9.5
industry 9.4
architecture 9.4
cannon 9.4
landscape 8.9
sun 8.9
metal 8.8
gun 8.7
evening 8.4
energy 8.4
power 8.4
hot 8.4
building 8.2
working 8
destruction 7.8
disaster 7.8
male 7.8
travel 7.7
stone 7.7
outdoor 7.6
dirt 7.6
house 7.5
cart 7.5
tractor 7.5
heat 7.4
water 7.3
protection 7.3
danger 7.3
religion 7.2
farm 7.1
equipment 7

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

text 99.3
person 86.7
black and white 84.5
man 83.4
black 77.4
white 71.3
old 71.1
wheel 60.8
outdoor object 33.2

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 39-47
Gender Male, 99.9%
Calm 57%
Happy 30%
Confused 3.5%
Sad 2.6%
Disgusted 2%
Surprised 1.9%
Fear 1.8%
Angry 1.3%

AWS Rekognition

Age 28-38
Gender Male, 99.4%
Calm 83.7%
Sad 14%
Confused 1%
Disgusted 0.4%
Angry 0.3%
Fear 0.2%
Happy 0.2%
Surprised 0.1%

AWS Rekognition

Age 29-39
Gender Female, 69%
Calm 53.7%
Fear 30.5%
Angry 5.3%
Confused 2.7%
Disgusted 2.2%
Sad 2.1%
Surprised 2%
Happy 1.5%

Feature analysis

Amazon

Wheel
Person
Wheel 99.9%
Person 99.7%
Person 99.4%
Person 88.7%

Captions

Microsoft
created on 2022-02-05

an old photo of a person 88.8%
old photo of a person 87.4%
a old photo of a person 85.4%