Human Generated Data

Title

Untitled (baby on changing table)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17034

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 97.8%
Human 97.8%
Face 86.5%
Person 84.1%
Person 80.3%
Person 77.5%
Person 69.2%
Person 67.7%
Portrait 66.4%
Photography 66.4%
Photo 66.4%
Person 64.5%
Train 63%
Transportation 63%
Vehicle 63%
Room 59.2%
Indoors 59.2%
Person 52.6%
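
The Amazon tags are label/confidence pairs of the kind the AWS Rekognition DetectLabels API returns. A minimal boto3 sketch of how such a list could be regenerated, assuming ambient AWS credentials and a placeholder image file:

    import boto3

    # Rekognition label detection; AWS credentials are assumed to be
    # configured in the environment. "photo.jpg" is a placeholder path.
    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=50,  # drop low-confidence labels, as in the list above
        )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")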

Clarifai
created on 2023-10-29

monochrome 99.5%
people 99.5%
adult 97.2%
vehicle 95.4%
street 94.5%
man 93.4%
one 93%
woman 90.6%
car 90.5%
transportation system 90.1%
portrait 89.4%
military 82%
black and white 81.7%
vehicle window 79.4%
child 79.3%
wear 78.7%
two 77.6%
police 76.5%
old 76.3%
war 76%
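
The Clarifai concepts match the output shape of Clarifai's public general image recognition model. A sketch using the clarifai-grpc Python client; the access token, image URL, and the user/app/model IDs are assumptions based on Clarifai's documented public model:

    from clarifai_grpc.channel.clarifai_channel import ClarifaiChannel
    from clarifai_grpc.grpc.api import resources_pb2, service_pb2, service_pb2_grpc

    channel = ClarifaiChannel.get_grpc_channel()
    stub = service_pb2_grpc.V2Stub(channel)
    metadata = (("authorization", "Key YOUR_PAT"),)  # placeholder access token

    request = service_pb2.PostModelOutputsRequest(
        # Public general model; these IDs are assumptions from Clarifai's docs.
        user_app_id=resources_pb2.UserAppIDSet(user_id="clarifai", app_id="main"),
        model_id="general-image-recognition",
        inputs=[resources_pb2.Input(data=resources_pb2.Data(
            image=resources_pb2.Image(url="https://example.com/photo.jpg")))],
    )
    response = stub.PostModelOutputs(request, metadata=metadata)

    for concept in response.outputs[0].data.concepts:
        # concept.value is a 0-1 score; scale to match the percentages above.
        print(f"{concept.name} {concept.value * 100:.1f}")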

Imagga
created on 2022-02-26

blackboard 100%
technology 26%
digital 25.9%
3d 20.9%
science 19.5%
business 18.2%
effects 18%
three dimensional 16.8%
computer 16.8%
graphics 16.2%
man 15.4%
negative 14.8%
imagination 14.2%
finance 13.5%
education 13%
render 13%
people 12.8%
male 12.8%
modern 12.6%
financial 12.5%
design 12.4%
global 11.8%
communication 11.7%
graph 11.5%
chart 11.5%
black 11.4%
success 11.3%
motion 11.1%
data 10.9%
graphic 10.9%
film 10.9%
market 10.6%
businessman 10.6%
sign 10.5%
person 10.3%
line 10.3%
board 9.9%
human 9.7%
information 9.7%
medical 9.7%
symbol 9.4%
work 9.4%
texture 9%
hand 8.3%
dollar 8.3%
network 8.3%
connection 8.2%
student 8.1%
currency 8.1%
bright 7.9%
figures 7.7%
photographic paper 7.7%
profit 7.6%
health 7.6%
equipment 7.6%
tech 7.6%
pattern 7.5%
future 7.4%
presentation 7.4%
investment 7.3%
screen 7.1%
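
The Imagga tags correspond to Imagga's v2 /tags REST endpoint. A minimal sketch with the requests library, assuming a placeholder key/secret pair and image URL:

    import requests

    # Imagga v2 tagging endpoint with HTTP Basic auth (key/secret are placeholders).
    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.com/photo.jpg"},
        auth=("api_key", "api_secret"),
    )
    response.raise_for_status()

    for item in response.json()["result"]["tags"]:
        # Each entry carries a confidence and a localized tag name.
        print(f"{item['tag']['en']} {item['confidence']:.1f}")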

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 98%
black and white 91.7%
black 69.9%
old 67.8%
concert 55.1%
posing 48.8%
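
The Microsoft tags correspond to Azure Computer Vision image tagging. A sketch with the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholders:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Endpoint and key are placeholders for an Azure Computer Vision resource.
    client = ComputerVisionClient(
        "https://YOUR-RESOURCE.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("YOUR_KEY"),
    )

    result = client.tag_image("https://example.com/photo.jpg")
    for tag in result.tags:
        # Confidence is reported in the 0-1 range; scale to match the list above.
        print(f"{tag.name} {tag.confidence * 100:.1f}")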

Face analysis

AWS Rekognition

Age 39-47
Gender Male, 99.8%
Happy 58.8%
Surprised 38.2%
Fear 2.3%
Angry 0.2%
Calm 0.2%
Confused 0.1%
Disgusted 0.1%
Sad 0.1%

AWS Rekognition

Age 24-34
Gender Male, 50.5%
Calm 94.3%
Sad 2.8%
Happy 1.9%
Surprised 0.4%
Angry 0.2%
Fear 0.2%
Confused 0.2%
Disgusted 0.1%

AWS Rekognition

Age 27-37
Gender Female, 77.6%
Happy 83.4%
Calm 12.9%
Angry 0.8%
Sad 0.8%
Confused 0.7%
Fear 0.6%
Surprised 0.4%
Disgusted 0.3%
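
The age, gender, and emotion breakdowns above follow the fields of AWS Rekognition's DetectFaces response, one block per detected face. A minimal boto3 sketch, with a placeholder image path:

    import boto3

    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:  # placeholder path
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age range, gender, and emotions
        )

    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions arrive unsorted; sort descending to match the lists above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")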

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely
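
Google Vision reports face attributes as likelihood buckets rather than percentages, which is why these rows read "Very unlikely" instead of a score. A sketch with the google-cloud-vision client, assuming ambient GCP credentials and a placeholder image file:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:  # placeholder path
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)
    for face in response.face_annotations:
        # Attributes come back as likelihood buckets, VERY_UNLIKELY..VERY_LIKELY.
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)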

Feature analysis

Amazon

Person 97.8%
Person 84.1%
Person 80.3%
Person 77.5%
Person 69.2%
Person 67.7%
Person 64.5%
Person 52.6%
Train 63%
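
The repeated Person rows are separate detected instances of the same label; DetectLabels attaches a per-instance confidence and bounding box for such labels. A boto3 sketch of how the instance list could be unpacked, with a placeholder image path:

    import boto3

    client = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:  # placeholder path
        response = client.detect_labels(Image={"Bytes": f.read()})

    # Labels such as Person carry one entry per detected instance, which is
    # why the same name repeats above with different confidences.
    for label in response["Labels"]:
        for instance in label.get("Instances", []):
            box = instance["BoundingBox"]  # ratios of image width and height
            print(f"{label['Name']} {instance['Confidence']:.1f}% "
                  f"left={box['Left']:.2f} top={box['Top']:.2f}")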

Categories

Imagga

interior objects 99.7%

Text analysis

Amazon

NASON

Google

27
27
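
The text hits ("NASON" from Amazon, "27" from Google) are OCR detections. Both services return full lines as well as individual words, so a one-word line can appear twice. A boto3 sketch for the Rekognition side, with a placeholder image path:

    import boto3

    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:  # placeholder path
        response = client.detect_text(Image={"Bytes": f.read()})

    for detection in response["TextDetections"]:
        # Type is LINE or WORD; a single word that forms a whole line
        # appears as both, which can duplicate short strings like "27".
        print(detection["Type"], detection["DetectedText"],
              f"{detection['Confidence']:.1f}%")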