Human Generated Data

Title

Untitled (double image of rephotographed early portrait of young man in military uniform)

Date

1960

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10043

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Head 96.8
Human 96.7
Person 96.7
Person 96.5
Face 86.9
Art 81.6
Furniture 75.5
Drawing 73.5
Photo 66
Photography 66
Portrait 65.8
Apparel 65.1
Clothing 65.1
Text 65
Painting 58.7
Advertisement 58.1
Poster 58.1
Dish 57.8
Food 57.8
Meal 57.8
Collage 57.7
Sketch 57

Imagga
created on 2022-01-29

newspaper 100
product 86.8
creation 68.1
money 46
currency 43.1
cash 42.1
dollar 39
paper 38.4
daily 34
wealth 30.5
finance 30.4
banking 30.4
bank 29.6
business 29.2
vintage 29
bill 28.6
dollars 27.1
hundred 26.2
stamp 25.2
financial 24.1
mail 23.9
postmark 23.7
one 23.2
us 23.1
letter 22.9
savings 22.4
old 22.3
envelope 20.1
exchange 20.1
close 20
retro 19.7
rich 19.6
circa 18.8
postal 18.6
pay 18.2
loan 18.2
ancient 18.2
franklin 17.7
printed 17.7
postage 17.7
bills 17.5
states 17.4
shows 16.7
investment 16.5
banknotes 15.7
closeup 15.5
aged 15.4
philately 14.8
note 14.7
finances 13.5
symbol 13.5
card 13.3
message 12.8
banknote 12.6
notes 12.5
post 12.4
united 12.4
sign 12
funds 11.8
market 11.6
treasury 10.6
success 10.5
capital 10.4
stamps 9.9
wages 9.8
art 9.5
drawing 9.3
economy 9.3
global 9.1
sketch 9.1
book jacket 9
twenty 8.9
president 8.8
legal 8.8
antique 8.8
payment 8.7
profit 8.6
design 8.4
commerce 8.4
object 8.1
concepts 8
jacket 7.9
100 7.9
address 7.8
face 7.8
debt 7.7
culture 7.7
price 7.7
fine 7.6
unique 7.6
buy 7.5
stock 7.5
number 7.5
security 7.4
office 7.2
history 7.2

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

drawing 98.2
human face 96.8
sketch 96.3
text 96
person 90.1
man 90
handwriting 67.1
posing 65.9
old 64.5

Face analysis

AWS Rekognition

Age 31-41
Gender Female, 51.5%
Sad 57.3%
Calm 33.4%
Angry 3.9%
Confused 2.3%
Surprised 1.3%
Happy 0.7%
Disgusted 0.6%
Fear 0.4%

AWS Rekognition

Age 26-36
Gender Male, 61.6%
Sad 50.5%
Calm 42.2%
Angry 2.2%
Surprised 1.9%
Confused 1.7%
Happy 0.8%
Disgusted 0.5%
Fear 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 96.7%

Captions

Microsoft

a vintage photo of a person 71.1%
an old photo of a person 70.6%
a vintage photo of some people posing for the camera 47.1%

Text analysis

Amazon

3
2
21 5/21 3 2 (3
(3
21
5/21

Google

१) 3 2 ৫
3
१)
2