Human Generated Data

Title

Untitled (man reading a document)

Date

c. 1892–c. 1905

People

Artist: Sarah Choate Sears, American, 1858–1935

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Montgomery S. Bradley and Cameron Bradley, P1984.66

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Human 98
Text 93.6
Person 87.6
Reading 70.8
Suit 56.8
Clothing 56.8
Coat 56.8
Overcoat 56.8
Apparel 56.8
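
The label/confidence pairs above are the shape of output returned by Amazon Rekognition's DetectLabels operation. A minimal sketch with boto3, assuming configured AWS credentials; the file path and thresholds are placeholders, not part of the museum record:

```python
# Minimal sketch: image labels via Amazon Rekognition DetectLabels (boto3).
# Assumes AWS credentials are configured; "photo.jpg" is a placeholder path.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,        # cap the number of labels returned
        MinConfidence=50.0,  # drop low-confidence labels
    )

# Each label carries a name and a 0-100 confidence score,
# matching the "Human 98 / Text 93.6 / ..." list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```

For labels such as Person, the response also includes an Instances list with bounding boxes, which is presumably the source of the Person 87.6% entry under Feature analysis below.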

Clarifai
created on 2023-10-27

people 99.7
portrait 99.3
one 98.6
sepia 98.4
vintage 98.4
sepia pigment 97.6
art 97.5
paper 97.4
adult 97.2
man 97.2
retro 96.8
wear 96.6
book bindings 95.4
old 95.2
woman 90.9
two 90.1
antique 89.7
page 88.6
monochrome 88
nostalgia 86.8
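
The Clarifai concepts look like output from Clarifai's general image-recognition model. A rough sketch against the v2 REST API, assuming an app-scoped API key and the public general-image-recognition model id; the key, model id, and image URL are all placeholders here:

```python
# Rough sketch: concept tagging via Clarifai's v2 REST API with requests.
# The API key, model id, and image URL below are placeholders/assumptions.
import requests

CLARIFAI_KEY = "your-api-key"
MODEL_ID = "general-image-recognition"  # assumed public general model id

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)
response.raise_for_status()

# Concepts come back with names and 0-1 confidence values,
# e.g. "people 0.997" -> the "people 99.7" entry above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```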

Imagga
created on 2022-01-23

carton 52.3
box 46.9
container 31.7
paper 23.8
toilet tissue 23.8
tissue 18.9
business 18.2
computer 17.3
laptop 16.9
device 15.4
man 14.8
office 13.7
home 12.8
person 11.7
people 11.7
adult 11.6
black 11.4
male 10.6
technology 10.4
holding 9.9
suit 9.9
hand 9.9
sofa 9.9
modern 9.8
interior 9.7
working 9.7
businessman 9.7
close 9.7
room 9.6
money 9.4
finance 9.3
information 8.9
work 8.6
job 8
smiling 8
support 7.9
sitting 7.7
empty 7.7
keyboard 7.7
happy 7.5
present 7.3
businesswoman 7.3
open 7.2
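
The Imagga tags above match the response shape of Imagga's v2 tagging endpoint. A small sketch, assuming Imagga API credentials and a publicly reachable image URL (both placeholders):

```python
# Small sketch: Imagga v2 /tags endpoint with HTTP Basic auth.
# API key/secret and the image URL are placeholders.
import requests

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=("your-api-key", "your-api-secret"),
)
response.raise_for_status()

# Tags arrive as {"confidence": 52.3, "tag": {"en": "carton"}},
# matching the "carton 52.3 / box 46.9 / ..." list above.
for item in response.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```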

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 98.4
book 90.2
person 79.2
human face 75.8
letter 67
clothing 64.2
black and white 52.7
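
Microsoft's tags match the output of the Azure Computer Vision tagging operation. A sketch using the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholders:

```python
# Sketch: image tagging with the Azure Computer Vision SDK.
# Endpoint, key, and image URL are placeholders/assumptions.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://your-resource.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("your-key"),
)

result = client.tag_image("https://example.com/photo.jpg")

# Tags carry a name and a 0-1 confidence, e.g. "text 0.984" -> "text 98.4" above.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```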

Face analysis

AWS Rekognition

Age 23-31
Gender Male, 79.1%
Calm 87.1%
Happy 7.6%
Sad 4.7%
Confused 0.2%
Disgusted 0.2%
Fear 0.1%
Angry 0.1%
Surprised 0.1%
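
The age range, gender, and per-emotion scores above follow the FaceDetails structure that Rekognition's DetectFaces returns when called with Attributes=['ALL']. A sketch, assuming configured AWS credentials and a placeholder file path:

```python
# Sketch: face attributes (age range, gender, emotions) via Rekognition DetectFaces.
# Assumes AWS credentials are configured; "photo.jpg" is a placeholder path.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # Each emotion is scored independently, e.g. Calm 87.1%, Happy 7.6%, ...
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```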

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
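
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why the values above read "Very unlikely". A sketch with the google-cloud-vision client; credentials and the file path are placeholders:

```python
# Sketch: likelihood-bucket face attributes via the Google Cloud Vision API.
# Assumes Google credentials are configured; "photo.jpg" is a placeholder path.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum, e.g. VERY_UNLIKELY -> "Very unlikely" above.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```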

Feature analysis

Amazon

Person 87.6%

Categories

Imagga

paintings art 88.5%
food drinks 6.4%
interior objects 4.1%
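
The category labels above ("paintings art", "food drinks", "interior objects") are consistent with Imagga's v2 categorization endpoint; the personal_photos categorizer id used below is an assumption, as are the credentials and URL:

```python
# Sketch: Imagga v2 categorization. "personal_photos" is assumed to be the
# stock categorizer id; credentials and the image URL are placeholders.
import requests

response = requests.get(
    "https://api.imagga.com/v2/categories/personal_photos",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=("your-api-key", "your-api-secret"),
)
response.raise_for_status()

# Categories arrive with a localized name and a confidence percentage.
for item in response.json()["result"]["categories"]:
    print(f'{item["name"]["en"]} {item["confidence"]:.1f}')
```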

Captions

Microsoft
created on 2022-01-23

a close up of a person 29.7%
a photo of a person 29.6%
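
Captions with confidence scores like these come from Azure Computer Vision's describe operation. A sketch with the same SDK as the tagging example; endpoint, key, and image URL are placeholders:

```python
# Sketch: natural-language captions via the Azure Computer Vision describe operation.
# Endpoint, key, and image URL are placeholders/assumptions.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://your-resource.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("your-key"),
)

description = client.describe_image("https://example.com/photo.jpg", max_candidates=3)

# Each candidate caption has text and a 0-1 confidence,
# e.g. "a close up of a person" at 0.297 -> 29.7% above.
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```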