Human Generated Data

Title

Untitled (woman seated at decorated desk)

Date

1948

People

Artist: John Deusing, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1552

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 96.6
Human 96.6
Indoors 94.5
Room 94.4
Interior Design 92.9
Text 88.4
Bedroom 82.8
Dorm Room 77.9
Person 77.4
People 67.3
Female 65
Workshop 57.2
Furniture 56.8

Imagga
created on 2021-12-14

architecture 30.8
building 26.8
structure 24.3
daily 21.5
city 20.8
newspaper 20.2
urban 19.2
supermarket 19.2
construction 18.8
modern 18.2
business 17.6
sketch 17.3
product 16.4
drawing 16.4
house 15.9
interior 14.1
equipment 13.8
industry 13.7
design 13.5
engineering 13.3
office 13
floor 13
project 12.5
grocery store 12.4
creation 12.1
mercantile establishment 12
travel 12
old 11.8
industrial 11.8
steel 11.5
plan 11.3
new 11.3
metal 11.3
home 11.2
shop 11.1
sky 10.8
architect 10.6
development 10.5
window 10.3
exterior 10.1
stall 10
tower 9.8
technology 9.6
negative 9.4
light 9.4
glass 9.3
3d 9.3
marketplace 9.3
stone 9.3
inside 9.2
frame 9.1
wall 8.7
work 8.6
lamp 8.6
perspective 8.5
street 8.3
tourism 8.2
center 8.2
history 8
working 7.9
indoors 7.9
people 7.8
bridge 7.8
house of cards 7.7
built 7.7
architectural 7.7
apartment 7.7
residential 7.7
roof 7.6
buildings 7.6
site 7.5
film 7.4
town 7.4
water 7.3
transport 7.3

Google
created on 2021-12-14

Black 89.5
Building 88.5
Black-and-white 85.3
Style 84
Adaptation 79.2
Tent 75.5
Monochrome 74.4
Monochrome photography 74.4
Font 70.9
Art 70
Facade 69.4
Visual arts 67.9
Pattern 66.8
Rectangle 64.3
Room 64.1
Stock photography 62.1
House 55.5
Shade 53.5
History 51.4

Microsoft
created on 2021-12-14

text 98.8
window 92.6
black and white 88.9
person 81.4
drawing 72.9
clothing 51.5
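Each machine-generated tag above pairs a label with a confidence score (percent). A minimal sketch of filtering such per-provider lists down to high-confidence labels; the sample data is copied from the lists above, but the function name and the 80% threshold are illustrative assumptions, not part of the record:

```python
# Hedged sketch: keep only tags at or above a confidence threshold.
# Sample scores are taken from the Amazon and Microsoft lists above;
# the threshold value is an assumption chosen for illustration.
tags = {
    "Amazon": [("Person", 96.6), ("Indoors", 94.5), ("Workshop", 57.2)],
    "Microsoft": [("text", 98.8), ("window", 92.6), ("clothing", 51.5)],
}

def high_confidence(tags_by_provider, threshold=80.0):
    """Return {provider: [label, ...]} keeping scores >= threshold."""
    return {
        provider: [label for label, score in pairs if score >= threshold]
        for provider, pairs in tags_by_provider.items()
    }

print(high_confidence(tags))
# → {'Amazon': ['Person', 'Indoors'], 'Microsoft': ['text', 'window']}
```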

Face analysis

AWS Rekognition

Age 19-31
Gender Female, 65.4%
Happy 65.1%
Calm 15.3%
Sad 14.5%
Angry 1.4%
Fear 1.3%
Confused 1.1%
Surprised 1%
Disgusted 0.2%
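The emotion scores above are a per-face confidence distribution. A small sketch of selecting the dominant emotion from such a result; the scores are copied from the list above, while the function name is a hypothetical helper, not an AWS Rekognition API call:

```python
# Emotion confidences (percent) as reported in the face analysis above.
emotions = {
    "Happy": 65.1, "Calm": 15.3, "Sad": 14.5, "Angry": 1.4,
    "Fear": 1.3, "Confused": 1.1, "Surprised": 1.0, "Disgusted": 0.2,
}

def dominant_emotion(scores):
    """Return the (label, score) pair with the highest confidence."""
    return max(scores.items(), key=lambda item: item[1])

print(dominant_emotion(emotions))
# → ('Happy', 65.1)
```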

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 96.6%

Captions

Microsoft

a group of people sitting in front of a window 56.4%
a group of people in front of a window 56.3%
a group of people standing in front of a window 56.2%

Text analysis

Amazon

399
THIS
3A-5999
5050
2A 5050
CAPE
BACKING
IA 399
ZA-6007
2A
FOR
2A-5750
246042
IA
RA
ZA-5970
THIS FR FOR YOUR
24-5751
A70A
ZA-584
2A-S804
3A 839
YOUR
TA-5091
Un
24-3044
2AS
24-
Un BACKING aut AGIR
MJI7 YESTAD A70A
محمد
3A-4581
MJI7
2A-60
2A·S
IAM800
2A:604
ZA-
PA-6940
on
YESTAD
aut
AGIR
FR
12

Google

YT37A2
SA-1581
SA-1581 MJ17 YT37A2 A7J A
MJ17
A7J
A