Human Generated Data

Title

Untitled (family in run down kitchen)

Date

1957, printed later

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.185

Machine Generated Data

Tags

Each service below lists its detected labels with a confidence score, given as a percentage.

Amazon
created on 2021-12-14

Person 99.8
Human 99.8
Person 99.4
Person 98.2
Clinic 89.1
Urban 80
Person 76.6
People 65.6
Building 62.3
Workshop 58.3
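
The Amazon labels above have the shape of AWS Rekognition DetectLabels output. A minimal sketch of such a call, assuming boto3 with configured AWS credentials (the local file name photo.jpg is hypothetical):

import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=50,  # only return labels scored at 50% or above
    )

for label in response["Labels"]:
    # Each label pairs a name with a confidence score in percent,
    # e.g. "Person 99.8" in the list above.
    print(f"{label['Name']} {label['Confidence']:.1f}")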

Clarifai
created on 2023-10-25

people 100
group 98.8
group together 98.5
adult 98.3
man 97.9
two 96.9
woman 96.3
three 93.9
portrait 93.5
child 90.4
room 89.9
monochrome 89.3
documentary 89.2
home 89.1
family 89.1
boy 84.6
offspring 80.9
four 80.3
recreation 78.1
furniture 78

Imagga
created on 2021-12-14

shop 26.6
barbershop 20.3
window 18
people 16.7
man 16.1
person 16.1
mercantile establishment 15.9
business 15.2
working 15
sliding door 13.3
adult 13.1
computer 12.9
equipment 12.8
industry 12.8
office 12.8
building 12.7
transportation 12.5
city 12.5
interior 12.4
indoors 12.3
door 12.1
male 12
old 11.1
work 11.1
musical instrument 11
power 10.9
place of business 10.9
worker 10.9
urban 10.5
men 10.3
television 10.3
inside 10.1
black 9.6
room 9.6
glass 9.3
movable barrier 9.2
house 9.2
back 9.2
occupation 9.2
chair 9.1
technology 8.9
device 8.8
home 8.8
structure 8.5
travel 8.4
call 8.4
transport 8.2
indoor 8.2
one 8.2
steel 8
businessman 7.9
architecture 7.8
station 7.7
empty 7.7
stall 7.5
design 7.3
vehicle 7.2
accordion 7.1
women 7.1
barrier 7.1
job 7.1
day 7.1

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 99.7
clothing 98.6
person 94.9
man 82.8
footwear 77.7
black and white 76.9
posing 62.2
woman 55.5
old 42.6

Face analysis

AWS Rekognition

Age 47-65
Gender Female, 90%
Happy 90.9%
Calm 6.7%
Sad 0.7%
Surprised 0.7%
Confused 0.4%
Disgusted 0.3%
Fear 0.2%
Angry 0.2%

AWS Rekognition

Age 23-37
Gender Female, 82.6%
Happy 92.2%
Calm 3.6%
Sad 1.8%
Surprised 0.6%
Fear 0.6%
Disgusted 0.6%
Confused 0.3%
Angry 0.3%

AWS Rekognition

Age 11-21
Gender Male, 87.1%
Calm 96.3%
Surprised 1.9%
Sad 0.5%
Happy 0.5%
Fear 0.3%
Angry 0.2%
Confused 0.2%
Disgusted 0.1%

AWS Rekognition

Age 33-49
Gender Male, 99.7%
Calm 96.2%
Sad 2.3%
Happy 1%
Angry 0.3%
Disgusted 0.1%
Confused 0.1%
Fear 0%
Surprised 0%
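
The four AWS Rekognition entries above (one per detected face) have the shape of DetectFaces output with all attributes requested. A minimal sketch, again assuming boto3 with configured AWS credentials and a hypothetical local file photo.jpg:

import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        # Emotion types arrive uppercase (e.g. "HAPPY"); one score per type.
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")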

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely
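
The Google Vision entries above report per-face likelihood ratings rather than percentages. A minimal sketch of the corresponding face-detection call, assuming the google-cloud-vision package with application credentials (photo.jpg is again a hypothetical local file):

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum, VERY_UNLIKELY through VERY_LIKELY.
    for name, value in [
        ("Surprise", face.surprise_likelihood),
        ("Anger", face.anger_likelihood),
        ("Sorrow", face.sorrow_likelihood),
        ("Joy", face.joy_likelihood),
        ("Headwear", face.headwear_likelihood),
        ("Blurred", face.blurred_likelihood),
    ]:
        print(name, value.name.replace("_", " ").capitalize())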

Feature analysis

Amazon

Person 99.8%

Categories

Imagga

paintings art 99.5%