Human Generated Data

Title

Portraits of Salespeople

Date

1973

People

Artist: Allan Sekula, American, 1951-2013

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Margaret Fisher Fund, 2017.184

Copyright

© Allan Sekula Studio

Machine Generated Data

Tags

Amazon
created on 2019-04-10

Collage 100
Advertisement 100
Person 99.3
Human 99.3
Person 99.3
Person 99
Person 98.7
Person 98.7
Person 98.5
Person 97.9
Person 96.3
Person 95.7
Person 94.8
Person 92.5
Person 85.5
Person 81.4
Person 78.4
Person 77.9
Poster 73
Person 70.7
Person 70.5
Person 68.9
Person 62.9
Person 47.5
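
The label/confidence pairs above match the output shape of Amazon Rekognition's DetectLabels operation. A minimal sketch of how tags like these might be generated, assuming the boto3 SDK, configured AWS credentials, and a hypothetical local filename for the image:

    import boto3

    rekognition = boto3.client("rekognition")

    # Read the image as raw bytes (the filename is hypothetical).
    with open("2017.184.jpg", "rb") as f:
        image_bytes = f.read()

    # DetectLabels returns one entry per label with a 0-100 confidence score;
    # MinConfidence is an assumed threshold, not a setting documented on this page.
    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=45,
    )
    for label in response["Labels"]:
        print(label["Name"], round(label["Confidence"], 1))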

Clarifai
created on 2019-04-10

people 99.8
group 99.1
many 98.5
print 98
monochrome 97.1
adult 97.1
man 97
no person 95.7
retro 94.6
one 93.2
wear 91.8
collection 90.1
several 88.4
vintage 88.1
woman 88.1
set 87.9
illustration 87.7
art 86
nostalgia 84.4
collage 84.1

Imagga
created on 2018-12-17

set 24.7
symbol 22.9
icon 20.6
paper 20.2
design 19.8
frame 19.4
notebook 19.4
business 18.8
sign 18.8
blank 17.1
collection 17.1
graphic 16.1
icons 15.8
card 15.7
money 15.3
element 14.1
galley 13.9
diskette 13.6
art 13.6
old 13.2
vessel 12.8
currency 12.6
office 12.1
product 12
letter 11.9
object 11.7
retro 11.5
black 11.4
box 11.3
dollar 11.1
magnetic disk 10.6
post 10.5
finance 10.1
cash 10.1
vehicle 10
magazine 9.8
stamp 9.8
mail 9.6
home 9.6
decoration 9.5
memory device 9.4
grunge 9.4
creation 9.4
web 9.3
drawing 9.3
communication 9.2
silhouette 9.1
craft 9
text 8.7
binder 8.5
device 8.4
modern 8.4
page 8.4
vintage 8.3
technology 8.2
board 8.1
financial 8
facility 7.8
album 7.8
mark 7.7
flower 7.7
letterhead 7.7
exchange 7.6
container 7.6
sketch 7.4
book 7.4
note 7.4
message 7.3
stationery 7.3
square 7.2
border 7.2
computer 7.2
colors 7.1

Google
created on 2018-12-17

Microsoft
created on 2018-12-17

gallery 99.3
scene 94.7
room 92.9
library 92.9
print 75.5
postal 72.5
street 70.3
black and white 63.9
scan 63.1
man 52.4

Color Analysis

Face analysis

AWS Rekognition

Age 29-45
Gender Male, 55%
Sad 45.6%
Angry 45.2%
Calm 53.9%
Surprised 45%
Happy 45.1%
Disgusted 45%
Confused 45.2%

AWS Rekognition

Age 57-77
Gender Female, 52.4%
Disgusted 45.6%
Sad 45.5%
Confused 46.1%
Angry 45.8%
Happy 46.4%
Calm 50.1%
Surprised 45.4%

AWS Rekognition

Age 26-43
Gender Female, 99.6%
Angry 0.4%
Disgusted 0.6%
Sad 0.2%
Happy 97.1%
Confused 0.4%
Calm 0.3%
Surprised 1%

AWS Rekognition

Age 26-43
Gender Male, 54.9%
Surprised 45.3%
Calm 48.7%
Angry 45.8%
Sad 46.5%
Happy 46.8%
Confused 46.2%
Disgusted 45.6%

AWS Rekognition

Age 48-68
Gender Male, 54.9%
Confused 45.2%
Surprised 45.1%
Sad 45.9%
Calm 47.1%
Angry 45.4%
Disgusted 45.1%
Happy 51.2%

AWS Rekognition

Age 26-43
Gender Male, 54.8%
Surprised 45.3%
Happy 48.6%
Calm 47.9%
Sad 46.6%
Disgusted 45.5%
Confused 45.6%
Angry 45.5%

AWS Rekognition

Age 26-43
Gender Female, 54.7%
Angry 45.1%
Surprised 45.1%
Sad 45.1%
Happy 53.7%
Confused 45.1%
Calm 45.9%
Disgusted 45.1%

AWS Rekognition

Age 48-68
Gender Male, 53.1%
Confused 45%
Angry 45.1%
Surprised 45%
Calm 45.1%
Disgusted 45%
Happy 45%
Sad 54.8%

AWS Rekognition

Age 20-38
Gender Female, 53.4%
Calm 45%
Sad 45.2%
Happy 54.5%
Surprised 45%
Angry 45.1%
Confused 45.1%
Disgusted 45%

AWS Rekognition

Age 20-38
Gender Male, 50.3%
Calm 49.6%
Angry 49.5%
Surprised 49.5%
Sad 49.5%
Happy 50.3%
Confused 49.5%
Disgusted 49.5%

AWS Rekognition

Age 57-77
Gender Female, 54.6%
Confused 45.1%
Disgusted 45.1%
Sad 53.1%
Angry 45.2%
Calm 46%
Happy 45.5%
Surprised 45.1%

AWS Rekognition

Age 26-43
Gender Female, 55%
Angry 45.1%
Sad 45.1%
Disgusted 45.3%
Surprised 45.5%
Happy 45.7%
Confused 45.1%
Calm 53.1%

AWS Rekognition

Age 30-47
Gender Female, 53.8%
Disgusted 45.1%
Sad 45.2%
Confused 45.1%
Angry 45.2%
Happy 53.9%
Calm 45.3%
Surprised 45.1%

AWS Rekognition

Age 26-43
Gender Male, 50.2%
Sad 49.5%
Happy 49.5%
Confused 49.5%
Disgusted 50.4%
Surprised 49.5%
Angry 49.5%
Calm 49.5%

AWS Rekognition

Age 30-47
Gender Male, 54.1%
Calm 49.8%
Surprised 45.3%
Angry 45.4%
Happy 48.5%
Disgusted 45.4%
Confused 45.2%
Sad 45.4%

AWS Rekognition

Age 26-44
Gender Male, 54.1%
Calm 52.7%
Disgusted 45.3%
Surprised 45.1%
Angry 45.9%
Happy 45.1%
Confused 45.2%
Sad 45.8%

AWS Rekognition

Age 35-52
Gender Female, 52%
Happy 47.8%
Confused 45.3%
Calm 50.1%
Surprised 45.8%
Disgusted 45.4%
Sad 45.2%
Angry 45.3%

AWS Rekognition

Age 20-38
Gender Female, 50.5%
Calm 49.9%
Surprised 49.5%
Confused 49.6%
Disgusted 49.6%
Happy 49.6%
Sad 49.7%
Angry 49.6%

AWS Rekognition

Age 26-43
Gender Female, 50.5%
Angry 49.5%
Disgusted 49.5%
Happy 49.5%
Calm 49.5%
Sad 50.4%
Confused 49.5%
Surprised 49.5%

AWS Rekognition

Age 35-52
Gender Female, 50.5%
Sad 50.2%
Happy 49.7%
Confused 49.5%
Disgusted 49.5%
Surprised 49.5%
Angry 49.5%
Calm 49.5%

AWS Rekognition

Age 27-44
Gender Female, 50.5%
Disgusted 49.5%
Sad 50.5%
Confused 49.5%
Calm 49.5%
Angry 49.5%
Surprised 49.5%
Happy 49.5%

AWS Rekognition

Age 10-15
Gender Female, 50.4%
Confused 49.6%
Happy 49.8%
Calm 49.5%
Disgusted 49.6%
Angry 49.6%
Sad 49.8%
Surprised 49.6%
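
Each AWS Rekognition block above (an age range, a gender guess, and a confidence per emotion) corresponds to one entry in the FaceDetails list returned by the DetectFaces operation. A minimal, self-contained sketch under the same assumptions (boto3, AWS credentials, hypothetical filename):

    import boto3

    rekognition = boto3.client("rekognition")
    with open("2017.184.jpg", "rb") as f:                 # filename is hypothetical
        image_bytes = f.read()

    # Request the full attribute set so age, gender, and emotions are included.
    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )
    for face in response["FaceDetails"]:
        age = face["AgeRange"]        # e.g. {"Low": 29, "High": 45}
        gender = face["Gender"]       # e.g. {"Value": "Male", "Confidence": 55.0}
        print(age["Low"], age["High"], gender["Value"], round(gender["Confidence"], 1))
        for emotion in face["Emotions"]:
            print(" ", emotion["Type"], round(emotion["Confidence"], 1))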

Microsoft Cognitive Services

Age 42
Gender Female

Microsoft Cognitive Services

Age 42
Gender Male

Microsoft Cognitive Services

Age 59
Gender Male

Microsoft Cognitive Services

Age 37
Gender Male

Microsoft Cognitive Services

Age 27
Gender Female

Microsoft Cognitive Services

Age 59
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely
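
The Google Vision blocks above report per-face likelihood ratings (Very unlikely through Very likely) rather than percentages. A minimal sketch of how such ratings might be produced, assuming the google-cloud-vision Python client, application default credentials, and the same hypothetical filename:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("2017.184.jpg", "rb") as f:                 # filename is hypothetical
        image = vision.Image(content=f.read())

    # face_detection returns one FaceAnnotation per face, with likelihood
    # enums (VERY_UNLIKELY ... VERY_LIKELY) for each attribute.
    response = client.face_detection(image=image)
    for face in response.face_annotations:
        print(
            face.surprise_likelihood,
            face.anger_likelihood,
            face.sorrow_likelihood,
            face.joy_likelihood,
            face.headwear_likelihood,
            face.blurred_likelihood,
        )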

Feature analysis

Amazon

Person 99.3%
Poster 73%

Categories

Imagga

paintings art 97.3%
text visuals 2.7%

Captions

Microsoft
created on 2018-12-17

a room with pictures on the wall 73.7%
a room with art on the wall 61.1%
a photo of a room 61%
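
Captions of this form (a short sentence plus a confidence score) are what the Azure Computer Vision "describe image" operation returns. A rough sketch, assuming the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and filename are placeholders:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
        CognitiveServicesCredentials("<subscription-key>"),      # placeholder key
    )

    # describe_image_in_stream returns candidate captions with 0-1 confidences.
    with open("2017.184.jpg", "rb") as f:                        # filename is hypothetical
        description = client.describe_image_in_stream(f, max_candidates=3)
    for caption in description.captions:
        print(caption.text, round(caption.confidence * 100, 1))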

Azure OpenAI
created on 2024-11-19

The image is a series of nine framed black and white photographs arranged in a three-by-three grid. Each photograph appears to capture a different scene, likely from commercial settings, featuring mannequins, products, and signs that suggest a retail or business environment.

Starting from the top left corner and moving to the right, the first photo shows a store display with mannequins dressed in fashion attire. The shop name is partially visible at the top. The second image features an interior scene with clothes on display and a prominent central sign with the name "Kay." In the third photo, a scene includes a pair of individuals standing next to each other.

The middle row, from left to right, begins with an outdoor shot of a person seated near a window display. The second image in this row shows a person behind a desk and the text "California City" above, with additional words mentioning a "FREE HOME SITE." The third photo captures two individuals standing behind a counter with a sign reading "Ag-Ro-Matic" above them.

The bottom row features three more scenes, starting on the left with an interior setting that includes furniture and decor, with "Bedroom Creations" written at the top. The middle photograph shows two individuals inside a store with a sign that reads "PERNICKETY I" above them. The final photo on the bottom right captures two individuals in a store or market setting with a variety of items visible around them.

Each photograph seems to offer a glimpse into different aspects of commerce or daily life, highlighted by the presence of advertising and commercial products. The collection has an artistic or documentary style, capturing scenes that might tell stories about the places and the era in which they were taken.

Anthropic Claude
created on 2024-11-19

This image appears to be a collage of various black and white photographs depicting different scenes from what seems to be a period in history. The photographs show a variety of people, businesses, and activities, providing a glimpse into the culture and daily life of that time. The images range from a group of women in a shop, to a man sitting at a desk in a storefront, to a group of men posing for a photograph. The overall impression is one of a historical record captured through the lens of a photographer.

Text analysis

Amazon

PERNICKETY
droom
FREE
OSMOSIS
HOME
RO-matic
Phone
283
REVERSE
WATER
droom PERNICKETY I
Creaticn/
FREE HOME DSITE
SYSTEMS
Koy
PURIFICATION
Christmas
Calona Ciuv RO-matic
Ciuv
Aq REVERSE OSMOSIS WATER PURIFICATION SYSTEMS
Bellling
UNWESITY
Calona
Phone 283 OcE T004Y OvLy
UNWESITY Christmas Fair Soecial
DSITE
T004Y
I
Fair
Soecial
OcE
OvLy
KCA RNMT
Aq
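
The word list above is raw OCR output of the kind produced by Amazon Rekognition's DetectText operation, which accounts for misreadings of the signage such as "Koy", "UNWESITY", and "T004Y OvLy". A minimal sketch, again assuming boto3 and a hypothetical local image file:

    import boto3

    rekognition = boto3.client("rekognition")
    with open("2017.184.jpg", "rb") as f:                 # filename is hypothetical
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    # Each detection is either a LINE or a WORD, with the recognized string
    # and a 0-100 confidence score.
    for detection in response["TextDetections"]:
        print(detection["Type"], detection["DetectedText"],
              round(detection["Confidence"], 1))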

Google

Romatic REVERSE OSMOSIS WATER PURIFICATION SYSTEMS FREE HOME SITE room Cregtien PERNICKETY I Christmas Fain Special ¼ OFF TODAY ONLY UNIVERSITY Phone 283.553 venus 79
Romatic
REVERSE
OSMOSIS
WATER
PURIFICATION
SYSTEMS
FREE
HOME
SITE
room
Cregtien
PERNICKETY
I
Christmas
Fain
Special
¼
OFF
TODAY
ONLY
UNIVERSITY
Phone
283.553
venus
79
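
The Google result follows the usual shape of Vision API text detection: the first annotation is the full recognized block, followed by one entry per word. A minimal sketch, assuming the google-cloud-vision client as in the face-detection example above:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("2017.184.jpg", "rb") as f:                 # filename is hypothetical
        image = vision.Image(content=f.read())

    # text_annotations[0] holds the full recognized text; the remaining
    # entries are the individual words.
    response = client.text_detection(image=image)
    for annotation in response.text_annotations:
        print(annotation.description)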