Human Generated Data

Title

Untitled (group of Lockhart men working on farm equipment)

Date

c. 1950

People

Artist: Harry Annas, American, 1897 - 1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2412

Machine Generated Data

Tags

Amazon
created on 2022-01-30

Wheel 99.7
Machine 99.7
Person 98.7
Human 98.7
Building 96.2
Person 96.2
Person 90.2
Person 89.6
Clothing 77.7
Apparel 77.7
Buggy 72.1
Vehicle 72.1
Transportation 72.1
Tire 65.5
Housing 64.1
Countryside 62.2
Nature 62.2
Outdoors 62.2
Automobile 61.2
Spoke 60.1
Car 58

Clarifai
created on 2023-10-28

people 99.9
vehicle 99.6
transportation system 99
adult 97.1
two 97
group 96.6
group together 96.4
aircraft 96
man 96
four 95.1
three 94.6
many 93.8
several 92.8
car 89.7
five 89.7
recreation 87.4
child 86.3
biplane 86.2
airplane 81.3
woman 80.5

Imagga
created on 2022-01-30

vehicle 78.8
car 48.1
wheeled vehicle 37.7
motor vehicle 30.9
military vehicle 29.2
transportation 26.9
half track 26.6
conveyance 24.5
tracked vehicle 24.3
machine 22.4
wheel 21.8
drive 18.9
transport 18.3
truck 17.9
equipment 17.9
old 17.4
grass 16.6
machinery 16.6
driving 16.4
industrial 16.3
sky 15.9
industry 15.4
auto 15.3
tractor 15.3
man 14.8
golf equipment 14.3
work 14.1
outdoors 13.4
road 12.6
farm 12.5
construction 12
outside 12
tank 11.5
automobile 11.5
rural 11.4
sports equipment 11.4
male 11.3
building 11.2
sport 11
field 10.9
vintage 10.7
steamroller 10.7
outdoor 10.7
dirt 10.5
power 10.1
speed 10.1
tire 9.9
travel 9.9
heavy 9.5
tool 9.4
yellow 9.3
landscape 8.9
track 8.9
metal 8.8
structure 8.4
people 8.4
summer 8.4
house 8.4
bulldozer 8.3
city 8.3
land 8.3
course 8.2
fun 8.2
snow 8.2
retro 8.2
activity 8.1
lawn mower 8
agriculture 7.9
broken 7.7
winter 7.7
sand 7.4
action 7.4
environment 7.4
working 7.1
country 7

Google
created on 2022-01-30

Microsoft
created on 2022-01-30

outdoor 99.3
text 93.8
black and white 84.9
wheel 77.9
vehicle 76.2
land vehicle 76.2
old 43.8

Color Analysis

Face analysis

AWS Rekognition

Age 34-42
Gender Male, 99.1%
Calm 86.9%
Sad 9.5%
Angry 1.5%
Confused 1.1%
Surprised 0.5%
Disgusted 0.3%
Happy 0.1%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Wheel 99.7%
Person 98.7%
Person 96.2%
Person 90.2%
Person 89.6%
Car 58%

Text analysis

Amazon

LOCKHART
ART
SAC
FIANERS
CEUS
UR FIANERS
URHART
EA
UR
YT37A°2-XAGOX

Google

SAI YT37A°2- AO EAMERS LOCKMART (EMS ORHART ART
SAI
YT37A°2-
AO
EAMERS
LOCKMART
(EMS
ORHART
ART