Human Generated Data

Title

Untitled (young couple dressed as farmers posing in fake farmhouse)

Date

c. 1950

People

Artist: Harry Annas, American 1897 - 1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2581

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Chair 99.9
Furniture 99.9
Person 99.6
Human 99.6
Person 99.3
Shoe 87.6
Clothing 87.6
Apparel 87.6
Footwear 87.6
Outdoors 64.5
Photography 62.6
Photo 62.6
People 61.6
Face 61
Text 60.4
Shoe 59.5
Shoe 58.2
Brick 57
Nature 56.4
Yard 56.4
Drawing 55.3
Art 55.3
Shorts 55.1

Imagga
created on 2022-01-15

shopping cart 100
handcart 100
wheeled vehicle 100
container 71
cart 35.1
shopping 34.9
chair 34.4
conveyance 33.5
buy 29.1
empty 24
basket 23.2
shop 23.1
store 22.7
supermarket 22.5
market 22.2
seat 22.1
metal 20.1
furniture 18.8
retail 18
trolley 17.7
sale 16.6
table 15.7
push 15.2
interior 14.1
chairs 13.7
buying 13.5
object 13.2
house 12.5
purchase 12.5
wheel 12.3
commerce 12.1
home 12
relaxation 11.7
wood 11.7
trade 11.5
rocking chair 11.4
business 10.9
nobody 10.9
checkout 10.9
commercial 10.3
summer 10.3
decoration 10.1
room 10.1
pushcart 9.9
grocery 9.7
customer 9.5
architecture 9.4
relax 9.3
outdoor 9.2
mall 8.8
lifestyle 8.7
holiday 8.6
outside 8.6
3d 8.5
sit 8.5
rest 8.5
design 8.4
modern 8.4
floor 8.4
leisure 8.3
metallic 8.3
plastic 8.3
relaxing 8.2
style 8.2
sun 8.1
grass 7.9
urban 7.9
food 7.9
trading 7.8
consumer 7.8
sitting 7.7
wall 7.7
money 7.7
sky 7.7
dining 7.6
shopping basket 7.5
chrome 7.5
park 7.4
inside 7.4
window 7.3
full 7.3
silver 7.1
indoors 7

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 95.6
outdoor 92.2
black and white 66.4
furniture 66.2
chair 51.6

Face analysis

AWS Rekognition

Age 34-42
Gender Female, 96.6%
Happy 84.7%
Calm 9.2%
Surprised 2.4%
Angry 1.5%
Sad 1%
Confused 0.5%
Disgusted 0.4%
Fear 0.3%

AWS Rekognition

Age 23-31
Gender Female, 99.7%
Sad 55.6%
Calm 42.9%
Happy 0.5%
Fear 0.2%
Angry 0.2%
Confused 0.2%
Surprised 0.2%
Disgusted 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Shoe 87.6%

Captions

Microsoft

a person sitting in front of a building 58.2%
a person sitting in front of a building 58.1%
a person sitting in a chair in front of a building 58%

Text analysis

Amazon

U
ق
NJE
حعمIا