Human Generated Data

Title

Untitled (man with doctor's bag walking next to car on country road)

Date

1951

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14749

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Person 96.3
Human 96.3
Car 90.5
Transportation 90.5
Vehicle 90.5
Automobile 90.5
Clothing 88.5
Apparel 88.5
Tire 84.1
Spoke 83.2
Machine 83.2
Light 76.5
Car Wheel 71.9
Headlight 69.7
Alloy Wheel 67.5
Wheel 65.4
Mirror 58
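
The Amazon rows above are label-detection output from AWS Rekognition (label name followed by a 0-100 confidence score). A minimal sketch of how comparable labels could be generated with boto3 follows; the file name and the MaxLabels/MinConfidence values are illustrative assumptions, not values recorded for this photograph.

    # Minimal sketch: Rekognition label detection for a local image file.
    # AWS credentials are assumed to be configured; the file name and the
    # MaxLabels/MinConfidence values are placeholders, not record values.
    import boto3

    def detect_labels(path="photo.jpg"):
        client = boto3.client("rekognition")
        with open(path, "rb") as f:
            response = client.detect_labels(
                Image={"Bytes": f.read()},
                MaxLabels=25,
                MinConfidence=50.0,
            )
        # Each label pairs a name with a confidence score (0-100),
        # matching rows such as "Person 96.3" and "Car 90.5" above.
        for label in response["Labels"]:
            print(f'{label["Name"]} {label["Confidence"]:.1f}')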

Clarifai
created on 2023-10-27

wedding 99.7
monochrome 99.7
people 99.4
bride 99.4
winter 97.2
groom 97.2
snow 96.9
veil 96.8
girl 96.2
street 96.1
black and white 96.1
man 95.5
woman 95.4
portrait 94.4
one 94.1
adult 93
lid 92.7
cold 91.6
couple 90.1
marriage 89.8

Imagga
created on 2022-01-29

car 48.8
vehicle 35.6
automobile 26.8
transportation 25.1
drive 22.7
person 22.6
auto 22
road 21.7
travel 19.7
people 18.4
driver 17.5
happy 15.7
man 15.5
portrait 15.5
adult 14.9
transport 14.6
window 14.2
outdoors 13.5
pretty 13.3
bride 13.3
groom 12.9
smile 12.8
love 12.6
driving 12.6
happiness 12.5
fashion 12
wedding 11.9
summer 11.6
wheel 11.4
male 11.3
outdoor 10.7
attractive 10.5
smiling 10.1
street 10.1
device 10.1
dress 9.9
white 9.9
hand 9.9
vessel 9.8
door 9.7
sky 9.6
traffic 9.5
cheerful 8.9
sexy 8.8
looking 8.8
hair 8.7
seat 8.7
couple 8.7
married 8.6
cute 8.6
business 8.5
bucket 8.4
joy 8.3
speed 8.2
one 8.2
water 8
machine 8
life 8
businessman 7.9
container 7.8
model 7.8
men 7.7
wife 7.6
trip 7.5
fun 7.5
safety 7.4
lady 7.3
success 7.2
black 7.2
sunset 7.2
open 7.2
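
The Imagga rows above are automatic tags with confidence scores. A rough sketch of retrieving similar tags is shown below, assuming Imagga's public /v2/tags REST endpoint with HTTP basic authentication; the API key, secret, and image URL are placeholders.

    # Rough sketch: Imagga-style tagging for an image URL.
    # Assumes the public https://api.imagga.com/v2/tags endpoint with
    # basic auth; the key, secret, and image URL are placeholders.
    import requests

    def imagga_tags(image_url, api_key="KEY", api_secret="SECRET"):
        response = requests.get(
            "https://api.imagga.com/v2/tags",
            params={"image_url": image_url},
            auth=(api_key, api_secret),
            timeout=30,
        )
        response.raise_for_status()
        # Each entry pairs an English tag with a confidence score (0-100),
        # as in rows such as "car 48.8" and "vehicle 35.6" above.
        for item in response.json()["result"]["tags"]:
            print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')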

Microsoft
created on 2022-01-29

outdoor 87.8
text 86.1
white 82.1
black 69.9
black and white 68.3
old 61.9
vintage 27.5

Face analysis

AWS Rekognition

Age 52-60
Gender Male, 98.7%
Calm 99.4%
Sad 0.3%
Angry 0.1%
Confused 0.1%
Disgusted 0%
Fear 0%
Happy 0%
Surprised 0%
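
The age range, gender, and emotion rows above correspond to AWS Rekognition face detection. A minimal sketch using boto3 follows; the file name is a placeholder and AWS credentials are assumed to be configured.

    # Minimal sketch: Rekognition face attributes (age range, gender, emotions).
    # The file name is a placeholder; AWS credentials are assumed configured.
    import boto3

    def detect_face_attributes(path="photo.jpg"):
        client = boto3.client("rekognition")
        with open(path, "rb") as f:
            response = client.detect_faces(
                Image={"Bytes": f.read()},
                Attributes=["ALL"],  # include age range, gender, emotions
            )
        for face in response["FaceDetails"]:
            age = face["AgeRange"]
            gender = face["Gender"]
            print(f'Age {age["Low"]}-{age["High"]}')
            print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
            for emotion in face["Emotions"]:
                print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')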

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
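
The Google Vision rows above are face-detection likelihood ratings (Very unlikely through Very likely). A short sketch using the google-cloud-vision client library follows; the file path is a placeholder and Google Cloud credentials are assumed to be configured.

    # Short sketch: Google Cloud Vision face-detection likelihoods
    # (surprise, anger, sorrow, joy, headwear, blur). The file path is a
    # placeholder; Google Cloud credentials are assumed to be configured.
    from google.cloud import vision

    def face_likelihoods(path="photo.jpg"):
        client = vision.ImageAnnotatorClient()
        with open(path, "rb") as f:
            image = vision.Image(content=f.read())
        response = client.face_detection(image=image)
        for face in response.face_annotations:
            # Likelihood enums render as VERY_UNLIKELY ... VERY_LIKELY.
            print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
            print("Anger", vision.Likelihood(face.anger_likelihood).name)
            print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
            print("Joy", vision.Likelihood(face.joy_likelihood).name)
            print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
            print("Blurred", vision.Likelihood(face.blurred_likelihood).name)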

Feature analysis

Amazon

Person 96.3%
Car 90.5%
Wheel 65.4%

Categories

Imagga

paintings art 99.8%