Human Generated Data

Title

Untitled (family in driveway)

Date

c. 1970

People

Artist: Bill Owens, American, b. 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.1089

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.3
Human 99.3
Person 98.6
Garage 96.8
Person 96.5
Chair 77.5
Furniture 77.5
Shoe 57.9
Clothing 57.9
Footwear 57.9
Apparel 57.9

Clarifai
created on 2023-10-25

people 99.8
home 98.4
two 97.5
man 96.4
adult 95.8
house 94.1
room 93.9
group together 93.8
woman 93.8
chair 93
street 92
family 91.5
vehicle 91.2
group 91
dog 90.8
porch 89.4
girl 87.2
one 86.1
wear 85.6
canine 84.9

Imagga
created on 2022-01-09

garage 100
house 35.1
home 21.5
building 19.6
architecture 19.5
old 18.1
wood 16.7
roof 15.2
structure 14.3
construction 13.7
sky 13.4
urban 13.1
window 11
patio 10.8
empty 10.3
city 10
rural 9.7
residential 9.6
estate 9.5
chair 9.5
industry 9.4
exterior 9.2
street 9.2
inside 9.2
travel 9.2
modern 9.1
business 9.1
park 9.1
door 9
transportation 9
farm 8.9
wall 8.5
winter 8.5
snow 8.2
shop 8.2
road 8.1
new 8.1
interior 8
wooden 7.9
dwelling 7.8
deck 7.8
residence 7.8
ancient 7.8
station 7.7
windows 7.7
outdoor 7.6
real 7.6
brick 7.5
room 7.5
area 7.4
work 7.1
furniture 7.1
country 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

building 99.5
outdoor 98.8
person 91.4
house 87
clothing 63.2

Face analysis

AWS Rekognition

Age 37-45
Gender Female, 100%
Happy 99.3%
Surprised 0.5%
Angry 0.1%
Confused 0%
Calm 0%
Fear 0%
Sad 0%
Disgusted 0%

AWS Rekognition

Age 4-10
Gender Female, 100%
Happy 94.7%
Calm 1.4%
Sad 1.4%
Angry 0.7%
Disgusted 0.6%
Fear 0.5%
Surprised 0.4%
Confused 0.2%

AWS Rekognition

Age 41-49
Gender Male, 99.6%
Happy 68.5%
Calm 18.3%
Confused 3.3%
Angry 3.1%
Disgusted 3%
Sad 1.9%
Fear 1%
Surprised 0.9%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%
Chair 77.5%
Shoe 57.9%

Text analysis

Amazon

UGT 517