Human Generated Data

Title

Untitled (car ride)

Date

1968

People

Artist: Barbara Norfleet, American, born 1926

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.1947

Copyright

© Barbara Norfleet

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Wheel 99.2
Machine 99.2
Person 99
Human 99
Person 98.9
Wheel 96.7
Car 96.6
Automobile 96.6
Vehicle 96.6
Transportation 96.6
Tire 96.6
Wheel 94.4
Person 92.3
Person 87.1
Model T 86.9
Antique Car 86.9
Wheel 83.3
Person 80.7
Sports Car 80.2
Buggy 67.8
Car Wheel 62.8
Spoke 59.7
Brick 55.3
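
A minimal sketch of how a scored label list like the one above can be retrieved from Amazon Rekognition with boto3. The file name and the MinConfidence threshold are placeholders, not details of the museum's actual pipeline.

import boto3

client = boto3.client("rekognition")

# "untitled_car_ride.jpg" is a hypothetical local copy of the photograph.
with open("untitled_car_ride.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # the lowest score shown above is 55.3
    )

# Each label carries a name and a 0-100 confidence score.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")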

Clarifai
created on 2023-10-25

people 99.3
monochrome 98.9
vehicle 98.1
child 94.4
transportation system 93.3
two 91.7
car 91.1
vintage 88.7
man 87.8
wheel 86.1
tree 83.6
war 83.5
driver 83.1
black and white 82.8
analogue 81.9
group together 81.4
one 81.3
adult 81.1
street 80.4
machine 80.3
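
A rough sketch of fetching concept tags from Clarifai's v2 REST predict endpoint, which returns scores in the 0-1 range (multiplied by 100 here to match the list above). The model ID, key, file name, and response shape are assumptions based on Clarifai's public general-recognition model; the current SDK or API version may differ.

import base64
import requests

API_KEY = "YOUR_CLARIFAI_KEY"           # placeholder credential
MODEL_ID = "general-image-recognition"  # assumed public model ID

with open("untitled_car_ride.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode()

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"base64": image_b64}}}]},
)

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")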

Imagga
created on 2022-01-08

vehicle 62.6
car 41.5
cannon 40.7
artillery 34.7
steamroller 33.8
field artillery 30.5
field 30.1
tractor 27
farm 25.9
armament 25.4
wheel 25.1
grass 24.5
gun 22.4
machinery 22.4
rural 22
agriculture 20.2
equipment 20
conveyance 19.6
motor vehicle 19.5
landscape 19.3
work 18.8
machine 18.2
sky 17.8
lawn mower 17.6
old 17.4
tire 17.2
high-angle gun 17.2
auto 16.3
weapon 16.2
farming 16.1
weaponry 15.9
outdoors 15.7
speed 15.6
transport 15.5
outdoor 15.3
hay 15
motor 14.6
man 14.1
harvest 14.1
garden tool 14.1
industry 13.7
driving 13.5
summer 13.5
drive 13.2
sport 13.2
tool 12.9
outside 12.8
road 12.6
dirt 12.4
working 12.4
farmer 11.8
power 11.7
engine 11.7
transportation 11.7
activity 11.6
wheeled vehicle 11.6
fun 11.2
land 11
earth 11
countryside 11
agricultural 10.7
truck 10.6
automobile 10.5
heavy 10.5
crop 10.3
fast 10.3
racer 10
feed 9.7
country 9.7
men 9.4
construction 9.4
yellow 9.3
industrial 9.1
farmland 8.7
race 8.6
action 8.3
autumn 7.9
scenic 7.9
sand 7.9
spring 7.8
tree 7.8
adult 7.8
extreme 7.7
go-kart 7.5
grain 7.4
wagon 7.3
people 7.2
lifestyle 7.2
meadow 7.2
travel 7
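
A rough sketch of Imagga's /v2/tags endpoint, which produces scored tags like the list above. The credentials and image URL are placeholders; the response shape follows Imagga's public documentation and may change.

import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/untitled_car_ride.jpg"},  # placeholder URL
    auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),  # placeholder credentials
)

# Tags come back with a 0-100 confidence and a language-keyed name.
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")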

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

outdoor 99.9
grass 99.7
tree 98.6
wheel 94.6
land vehicle 87.7
tire 84.7
vehicle 75.1
car 66.5
person 60.8
tractor 54.7
black and white 50.9
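
A rough sketch of the Azure Computer Vision "analyze" REST call that yields tags like the Microsoft list above. The endpoint, key, file name, and API version are placeholders; confidences are returned in the 0-1 range.

import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"                                          # placeholder

with open("untitled_car_ride.jpg", "rb") as f:
    resp = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=f.read(),
    )

for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")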

Color Analysis

Face analysis

AWS Rekognition

Age 21-29
Gender Female, 100%
Happy 99.5%
Sad 0.1%
Fear 0.1%
Angry 0.1%
Disgusted 0.1%
Surprised 0.1%
Calm 0%
Confused 0%

AWS Rekognition

Age 6-16
Gender Male, 97%
Fear 59.7%
Happy 13.6%
Surprised 10.8%
Sad 8.9%
Calm 2.4%
Confused 1.9%
Angry 1.7%
Disgusted 1%

AWS Rekognition

Age 42-50
Gender Male, 100%
Sad 100%
Fear 0%
Disgusted 0%
Angry 0%
Calm 0%
Confused 0%
Happy 0%
Surprised 0%

AWS Rekognition

Age 21-29
Gender Female, 95.8%
Happy 50.4%
Fear 22.6%
Sad 14.1%
Calm 6%
Disgusted 3.5%
Angry 1.6%
Confused 0.9%
Surprised 0.8%

AWS Rekognition

Age 28-38
Gender Male, 86.4%
Calm 43.8%
Sad 39.8%
Fear 14.2%
Confused 0.6%
Angry 0.6%
Happy 0.5%
Disgusted 0.4%
Surprised 0.1%
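
A minimal sketch of the Rekognition face-attribute call behind the blocks above. Each FaceDetail carries an estimated age range, a gender with confidence, and a ranked list of emotions; the file name is a placeholder.

import boto3

client = boto3.client("rekognition")

with open("untitled_car_ride.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions are returned unsorted; rank them by confidence as above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")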

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Possible
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
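
A minimal sketch of the Google Cloud Vision face-detection call behind the likelihood ratings above ("Very unlikely" through "Very likely"). The file name is a placeholder, and the enum names print in UPPER_CASE rather than the prose form shown here.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("untitled_car_ride.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each detected face reports a Likelihood enum per attribute.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)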

Feature analysis

Amazon

Wheel 99.2%
Person 99%

Text analysis

Amazon

2285
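
A minimal sketch of the Rekognition text-detection call that could produce the detection above; the file name is a placeholder.

import boto3

client = boto3.client("rekognition")

with open("untitled_car_ride.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

# Print whole detected lines with their confidence scores.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f"{detection['Confidence']:.1f}")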