Human Generated Data

Title

Untitled (hauling wood)

Date

1968

People

Artist: Barbara Norfleet, American, born 1926

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.1965

Copyright

© Barbara Norfleet

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.3
Human 99.3
Vehicle 94.2
Transportation 94.2
Car 89.5
Automobile 89.5
Buggy 76.2
Tire 74.8
Wheel 74.3
Machine 74.3
Shorts 73.8
Clothing 73.8
Apparel 73.8
Person 72.1
Tarmac 69.3
Asphalt 69.3
Sports Car 59.4
Wheel 58.8
Road 57
Spoke 55.9
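
The tag list above is a flat sequence of (label, confidence) pairs, with some labels repeated at different scores. A minimal sketch of filtering such pairs by a confidence threshold while deduplicating labels (the tag data is copied from above; the helper name and threshold are illustrative assumptions, not part of any vendor API):

```python
# Sample of the (label, confidence) pairs listed above.
AMAZON_TAGS = [
    ("Person", 99.3), ("Human", 99.3), ("Vehicle", 94.2),
    ("Transportation", 94.2), ("Car", 89.5), ("Automobile", 89.5),
    ("Buggy", 76.2), ("Tire", 74.8), ("Wheel", 74.3), ("Machine", 74.3),
]

def confident_labels(tags, threshold=90.0):
    """Return labels at or above the threshold, deduplicated,
    preserving first-seen order."""
    seen = []
    for label, score in tags:
        if score >= threshold and label not in seen:
            seen.append(label)
    return seen

print(confident_labels(AMAZON_TAGS))
# ['Person', 'Human', 'Vehicle', 'Transportation']
```

Duplicate labels (e.g. "Wheel" at 74.3 and 58.8 in the full list) typically correspond to separate detected instances, so keeping only the first occurrence retains the highest score.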

Clarifai
created on 2023-10-25

people 99.9
vehicle 99.9
transportation system 99.2
group together 98.4
military vehicle 98.4
war 97.9
military 97.6
soldier 96.8
adult 95.1
two 94.2
driver 93.9
gun 92.1
car 92
weapon 91
skirmish 90.8
man 90.8
group 89.8
army 86.9
engine 86.9
monochrome 86.6

Imagga
created on 2022-01-08

model t 100
car 100
motor vehicle 94.2
vehicle 51.7
wheeled vehicle 36.8
sidecar 34
transportation 28.7
road 27.1
golf equipment 25.9
wheel 25.6
drive 23.6
equipment 21.3
driving 21.2
auto 20.1
transport 20.1
truck 19.9
sports equipment 19.4
machine 18.4
automobile 18.2
tractor 16.7
motor 16.4
sky 15.9
engine 14.4
outdoor 13.8
man 12.8
danger 12.7
sport 12.3
landscape 11.9
wheels 11.7
outdoors 11.2
speed 11
work 11
tire 10.7
driver 10.7
rural 10.6
grass 10.3
power 10.1
people 10
field 10
leisure 10
male 9.9
farm 9.8
accident 9.8
old 9.7
machinery 9.7
summer 9.6
lifestyle 9.4
safety 9.2
travel 9.1
adult 9.1
fun 9
wreck 8.8
broken 8.7
dirt 8.6
street 8.3
motorcycle 7.9
hood 7.9
sand 7.9
sitting 7.7
men 7.7
industry 7.7
vintage 7.4
land 7.4
metal 7.2
recreation 7.2
cart 7.1
autumn 7

Microsoft
created on 2022-01-08

outdoor 98.8
vehicle 93.2
land vehicle 92.5
military vehicle 88.7
wheel 80.5
car 79.5
transport 78.4
black and white 72.1
auto part 65.9
old 64.1

Face analysis

AWS Rekognition

Age 47-53
Gender Male, 100%
Angry 34.2%
Fear 27.6%
Happy 20.8%
Calm 9.8%
Surprised 3.5%
Sad 1.8%
Confused 1.2%
Disgusted 1%
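
The emotion rows above are per-class confidence scores that together account for roughly 100%. A small sketch of picking the dominant emotion from such a result (the dict literal mirrors the values above; the function name is a hypothetical helper, not a Rekognition API call):

```python
# Emotion confidences as listed above (percentages).
EMOTIONS = {
    "Angry": 34.2, "Fear": 27.6, "Happy": 20.8, "Calm": 9.8,
    "Surprised": 3.5, "Sad": 1.8, "Confused": 1.2, "Disgusted": 1.0,
}

def dominant_emotion(scores):
    """Return the (name, score) pair with the highest confidence."""
    return max(scores.items(), key=lambda item: item[1])

print(dominant_emotion(EMOTIONS))
# ('Angry', 34.2)
```

Note that even the top class here is well under 50%, so the "dominant" emotion is a weak signal rather than a confident classification.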

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%
Wheel 74.3%

Categories

Imagga

cars vehicles 100%

Text analysis

Amazon

10
HOOD
ERED
10 REGI
REGI
ERED 6
6

Google

HOOD O REGIS ERED 6
HOOD
O
REGIS
ERED
6
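
The Google results above follow a common OCR output shape: one full detected line first, followed by the individual word fragments that compose it. A quick sketch verifying that relationship for the fragments listed above (the strings are copied verbatim from the output; no attempt is made to reconstruct the truncated words):

```python
# Full detected line, then its word-level fragments, as listed above.
FULL_LINE = "HOOD O REGIS ERED 6"
WORD_FRAGMENTS = ["HOOD", "O", "REGIS", "ERED", "6"]

# Joining the word fragments with spaces should recompose the full line.
recomposed = " ".join(WORD_FRAGMENTS)
print(recomposed == FULL_LINE)
# True
```

The fragments themselves ("REGIS", "ERED") are partial detections from the photograph and are left as reported rather than guessed at.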